Sep 30 18:45:59 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 18:45:59 crc restorecon[4742]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 18:45:59 crc restorecon[4742]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc 
restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc 
restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 
18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc 
restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc 
restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 
18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc 
restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc 
restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc 
restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 
30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 
crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc 
restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:45:59 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 18:46:00 crc restorecon[4742]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc 
restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 18:46:00 crc restorecon[4742]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 18:46:00 crc kubenswrapper[4747]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 18:46:00 crc kubenswrapper[4747]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 18:46:00 crc kubenswrapper[4747]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 18:46:00 crc kubenswrapper[4747]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 18:46:00 crc kubenswrapper[4747]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 18:46:00 crc kubenswrapper[4747]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.789547 4747 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802170 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802219 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802229 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802239 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802250 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802259 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802271 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802281 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802291 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802304 4747 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802315 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802325 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802333 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802342 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802351 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802361 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802370 4747 feature_gate.go:330] unrecognized feature gate: Example Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802379 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802387 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802396 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802404 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802413 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802422 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802431 4747 feature_gate.go:330] unrecognized feature gate: 
NetworkSegmentation
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802441 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802449 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802458 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802466 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802475 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802483 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802492 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802500 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802519 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802529 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802537 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802545 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802554 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802563 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802572 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802580 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802590 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802598 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802610 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802620 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802628 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802636 4747 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802649 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802661 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802672 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802681 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802690 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802699 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802707 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802717 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802725 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802734 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802746 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802755 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802764 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802773 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802781 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802793 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802802 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802810 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802819 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802828 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802836 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802844 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802853 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802862 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.802870 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803062 4747 flags.go:64] FLAG: --address="0.0.0.0"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803083 4747 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803099 4747 flags.go:64] FLAG: --anonymous-auth="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803112 4747 flags.go:64] FLAG: --application-metrics-count-limit="100"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803124 4747 flags.go:64] FLAG: --authentication-token-webhook="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803135 4747 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803147 4747 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803159 4747 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803170 4747 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803180 4747 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803190 4747 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803204 4747 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803217 4747 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803230 4747 flags.go:64] FLAG: --cgroup-root=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803242 4747 flags.go:64] FLAG: --cgroups-per-qos="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803255 4747 flags.go:64] FLAG: --client-ca-file=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803267 4747 flags.go:64] FLAG: --cloud-config=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803281 4747 flags.go:64] FLAG: --cloud-provider=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803294 4747 flags.go:64] FLAG: --cluster-dns="[]"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803312 4747 flags.go:64] FLAG: --cluster-domain=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803323 4747 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803333 4747 flags.go:64] FLAG: --config-dir=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803343 4747 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803354 4747 flags.go:64] FLAG: --container-log-max-files="5"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803367 4747 flags.go:64] FLAG: --container-log-max-size="10Mi"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803377 4747 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803387 4747 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803398 4747 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803408 4747 flags.go:64] FLAG: --contention-profiling="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803418 4747 flags.go:64] FLAG: --cpu-cfs-quota="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803428 4747 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803439 4747 flags.go:64] FLAG: --cpu-manager-policy="none"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803448 4747 flags.go:64] FLAG: --cpu-manager-policy-options=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803460 4747 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803471 4747 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803480 4747 flags.go:64] FLAG: --enable-debugging-handlers="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803490 4747 flags.go:64] FLAG: --enable-load-reader="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803500 4747 flags.go:64] FLAG: --enable-server="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803509 4747 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803522 4747 flags.go:64] FLAG: --event-burst="100"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803533 4747 flags.go:64] FLAG: --event-qps="50"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803546 4747 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803559 4747 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803571 4747 flags.go:64] FLAG: --eviction-hard=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803586 4747 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803599 4747 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803610 4747 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803622 4747 flags.go:64] FLAG: --eviction-soft=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803632 4747 flags.go:64] FLAG: --eviction-soft-grace-period=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803642 4747 flags.go:64] FLAG: --exit-on-lock-contention="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803652 4747 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803662 4747 flags.go:64] FLAG: --experimental-mounter-path=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803671 4747 flags.go:64] FLAG: --fail-cgroupv1="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803681 4747 flags.go:64] FLAG: --fail-swap-on="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803691 4747 flags.go:64] FLAG: --feature-gates=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803705 4747 flags.go:64] FLAG: --file-check-frequency="20s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803717 4747 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803727 4747 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803737 4747 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803748 4747 flags.go:64] FLAG: --healthz-port="10248"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803759 4747 flags.go:64] FLAG: --help="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803769 4747 flags.go:64] FLAG: --hostname-override=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803779 4747 flags.go:64] FLAG: --housekeeping-interval="10s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803789 4747 flags.go:64] FLAG: --http-check-frequency="20s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803799 4747 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803811 4747 flags.go:64] FLAG: --image-credential-provider-config=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803823 4747 flags.go:64] FLAG: --image-gc-high-threshold="85"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803836 4747 flags.go:64] FLAG: --image-gc-low-threshold="80"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803848 4747 flags.go:64] FLAG: --image-service-endpoint=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803860 4747 flags.go:64] FLAG: --kernel-memcg-notification="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803873 4747 flags.go:64] FLAG: --kube-api-burst="100"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803886 4747 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803899 4747 flags.go:64] FLAG: --kube-api-qps="50"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803911 4747 flags.go:64] FLAG: --kube-reserved=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803953 4747 flags.go:64] FLAG: --kube-reserved-cgroup=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803964 4747 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803975 4747 flags.go:64] FLAG: --kubelet-cgroups=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803984 4747 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.803995 4747 flags.go:64] FLAG: --lock-file=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804004 4747 flags.go:64] FLAG: --log-cadvisor-usage="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804014 4747 flags.go:64] FLAG: --log-flush-frequency="5s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804026 4747 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804053 4747 flags.go:64] FLAG: --log-json-split-stream="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804066 4747 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804076 4747 flags.go:64] FLAG: --log-text-split-stream="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804086 4747 flags.go:64] FLAG: --logging-format="text"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804098 4747 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804111 4747 flags.go:64] FLAG: --make-iptables-util-chains="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804121 4747 flags.go:64] FLAG: --manifest-url=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804133 4747 flags.go:64] FLAG: --manifest-url-header=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804149 4747 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804162 4747 flags.go:64] FLAG: --max-open-files="1000000"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804175 4747 flags.go:64] FLAG: --max-pods="110"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804187 4747 flags.go:64] FLAG: --maximum-dead-containers="-1"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804198 4747 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804209 4747 flags.go:64] FLAG: --memory-manager-policy="None"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804221 4747 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804232 4747 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804244 4747 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804255 4747 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804281 4747 flags.go:64] FLAG: --node-status-max-images="50"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804293 4747 flags.go:64] FLAG: --node-status-update-frequency="10s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804303 4747 flags.go:64] FLAG: --oom-score-adj="-999"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804313 4747 flags.go:64] FLAG: --pod-cidr=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804323 4747 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804337 4747 flags.go:64] FLAG: --pod-manifest-path=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804347 4747 flags.go:64] FLAG: --pod-max-pids="-1"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804357 4747 flags.go:64] FLAG: --pods-per-core="0"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804367 4747 flags.go:64] FLAG: --port="10250"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804377 4747 flags.go:64] FLAG: --protect-kernel-defaults="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804386 4747 flags.go:64] FLAG: --provider-id=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804396 4747 flags.go:64] FLAG: --qos-reserved=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804406 4747 flags.go:64] FLAG: --read-only-port="10255"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804416 4747 flags.go:64] FLAG: --register-node="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804428 4747 flags.go:64] FLAG: --register-schedulable="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804437 4747 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804454 4747 flags.go:64] FLAG: --registry-burst="10"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804464 4747 flags.go:64] FLAG: --registry-qps="5"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804473 4747 flags.go:64] FLAG: --reserved-cpus=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804484 4747 flags.go:64] FLAG: --reserved-memory=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804497 4747 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804507 4747 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804517 4747 flags.go:64] FLAG: --rotate-certificates="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804527 4747 flags.go:64] FLAG: --rotate-server-certificates="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804536 4747 flags.go:64] FLAG: --runonce="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804546 4747 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804556 4747 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804567 4747 flags.go:64] FLAG: --seccomp-default="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804576 4747 flags.go:64] FLAG: --serialize-image-pulls="true"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804586 4747 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804596 4747 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804606 4747 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804616 4747 flags.go:64] FLAG: --storage-driver-password="root"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804626 4747 flags.go:64] FLAG: --storage-driver-secure="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804636 4747 flags.go:64] FLAG: --storage-driver-table="stats"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804646 4747 flags.go:64] FLAG: --storage-driver-user="root"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804655 4747 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804665 4747 flags.go:64] FLAG: --sync-frequency="1m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804676 4747 flags.go:64] FLAG: --system-cgroups=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804685 4747 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804699 4747 flags.go:64] FLAG: --system-reserved-cgroup=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804709 4747 flags.go:64] FLAG: --tls-cert-file=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804718 4747 flags.go:64] FLAG: --tls-cipher-suites="[]"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804731 4747 flags.go:64] FLAG: --tls-min-version=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804741 4747 flags.go:64] FLAG: --tls-private-key-file=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804751 4747 flags.go:64] FLAG: --topology-manager-policy="none"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804761 4747 flags.go:64] FLAG: --topology-manager-policy-options=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804770 4747 flags.go:64] FLAG: --topology-manager-scope="container"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804781 4747 flags.go:64] FLAG: --v="2"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804795 4747 flags.go:64] FLAG: --version="false"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804807 4747 flags.go:64] FLAG: --vmodule=""
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804819 4747 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.804829 4747 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805164 4747 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805188 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805205 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805217 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805229 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805242 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805255 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805265 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805275 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805283 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805291 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805300 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805308 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805318 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805326 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805334 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805343 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805355 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805364 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805373 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805382 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805391 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805399 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805408 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805423 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805432 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805441 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805449 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805457 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805465 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805474 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805485 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805498 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805511 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805522 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805533 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805544 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805555 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805567 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805578 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805589 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805598 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805607 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805616 4747 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805624 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805632 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805641 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805649 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805658 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805666 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805674 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805683 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805691 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805700 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805708 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805716 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805729 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805738 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805747 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805755 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805763 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805775 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805786 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805796 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805808 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805818 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805827 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805836 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805845 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805854 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.805862 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.805889 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.816159 4747 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.816194 4747 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816298 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816306 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816313 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816318 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816324 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816329 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816334 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816339 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816346 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816354 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816360 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816365 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816371 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816377 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816383 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816388 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816393 4747 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816398 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816403 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816408 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816413 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816418 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816423 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930
18:46:00.816430 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816436 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816441 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816445 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816450 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816456 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816461 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816467 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816472 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816479 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816485 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816493 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816498 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816503 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816508 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816513 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816518 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816523 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816528 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816533 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816537 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816542 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816547 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816552 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 18:46:00 crc 
kubenswrapper[4747]: W0930 18:46:00.816557 4747 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816561 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816566 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816571 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816576 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816580 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816585 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816591 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816596 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816600 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816605 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816610 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816615 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816620 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816625 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 
18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816630 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816635 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816640 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816645 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816650 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816654 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816661 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816667 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816674 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.816683 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816852 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 18:46:00 crc kubenswrapper[4747]: 
W0930 18:46:00.816861 4747 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816868 4747 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816876 4747 feature_gate.go:330] unrecognized feature gate: Example Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816881 4747 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816887 4747 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816892 4747 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816897 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816903 4747 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816909 4747 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816914 4747 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816919 4747 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816940 4747 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816946 4747 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816950 4747 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816956 4747 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816961 4747 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816992 4747 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.816997 4747 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817002 4747 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817007 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817012 4747 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817017 4747 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817022 4747 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817027 4747 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817032 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817037 4747 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817041 4747 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817046 4747 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817051 4747 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 18:46:00 
crc kubenswrapper[4747]: W0930 18:46:00.817058 4747 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817064 4747 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817070 4747 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817075 4747 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817083 4747 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817090 4747 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817097 4747 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817103 4747 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817109 4747 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817116 4747 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817122 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817129 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817136 4747 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817144 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817151 4747 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817158 4747 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817164 4747 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817172 4747 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817178 4747 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817184 4747 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817191 4747 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817197 4747 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817203 4747 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817211 4747 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817218 4747 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817225 4747 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817231 4747 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817238 4747 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817244 4747 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817250 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817256 4747 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817262 4747 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817268 4747 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817274 4747 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817280 4747 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817286 4747 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817292 4747 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817298 4747 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 18:46:00 crc 
kubenswrapper[4747]: W0930 18:46:00.817304 4747 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817308 4747 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.817314 4747 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.817323 4747 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.817541 4747 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.822039 4747 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.822134 4747 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.823480 4747 server.go:997] "Starting client certificate rotation" Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.823507 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.825471 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-21 03:32:24.973855939 +0000 UTC Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.825563 4747 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1952h46m24.148297744s for next certificate rotation Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.849476 4747 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.855205 4747 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.879304 4747 log.go:25] "Validated CRI v1 runtime API" Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.920761 4747 log.go:25] "Validated CRI v1 image API" Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.922990 4747 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.931771 4747 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-18-41-27-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.931818 4747 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.959912 4747 manager.go:217] Machine: {Timestamp:2025-09-30 18:46:00.954978113 +0000 UTC m=+0.614458307 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:654e05b7-6acc-4d21-b8da-ee5f38eb9a9f BootID:37988aed-caa1-4cf6-8704-8dc8a1aec71e Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:db:8a:c3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:db:8a:c3 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c6:0a:fd Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e8:35:7b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:42:f1:45 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6b:38:42 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:1f:bb:07 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:32:b8:f9:93:57:c4 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0e:8b:91:c1:33:94 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.960645 4747 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.960888 4747 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.962672 4747 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.963099 4747 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.963154 4747 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.963452 4747 topology_manager.go:138] "Creating topology manager with none policy"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.963468 4747 container_manager_linux.go:303] "Creating device plugin manager"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.964023 4747 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.964068 4747 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.964276 4747 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.964394 4747 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.969087 4747 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.969126 4747 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.969161 4747 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.969183 4747 kubelet.go:324] "Adding apiserver pod source"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.969202 4747 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.976167 4747 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.978502 4747 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.979159 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Sep 30 18:46:00 crc kubenswrapper[4747]: W0930 18:46:00.979179 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Sep 30 18:46:00 crc kubenswrapper[4747]: E0930 18:46:00.979460 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Sep 30 18:46:00 crc kubenswrapper[4747]: E0930 18:46:00.979499 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.982601 4747 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985009 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985057 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985077 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985097 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985123 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985137 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985161 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985183 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985200 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985214 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985232 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.985246 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.987989 4747 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.988844 4747 server.go:1280] "Started kubelet"
Sep 30 18:46:00 crc systemd[1]: Started Kubernetes Kubelet.
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.995115 4747 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.995315 4747 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.996151 4747 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 30 18:46:00 crc kubenswrapper[4747]: I0930 18:46:00.998690 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.001993 4747 server.go:460] "Adding debug handlers to kubelet server"
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.002790 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.002854 4747 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.003008 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 08:59:03.655970725 +0000 UTC
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.003097 4747 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1310h13m2.652880508s for next certificate rotation
Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.001905 4747 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a23cb6903dc6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 18:46:00.988793965 +0000 UTC m=+0.648274119,LastTimestamp:2025-09-30 18:46:00.988793965 +0000 UTC m=+0.648274119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.003287 4747 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.003324 4747 volume_manager.go:287] "The desired_state_of_world populator starts"
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.003349 4747 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.003459 4747 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Sep 30 18:46:01 crc kubenswrapper[4747]: W0930 18:46:01.004194 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.004327 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.004440 4747 factory.go:55] Registering systemd factory
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.004473 4747 factory.go:221] Registration of the systemd container factory successfully
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.005742 4747 factory.go:153] Registering CRI-O factory
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.005780 4747 factory.go:221] Registration of the crio container factory successfully
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.005896 4747 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.005991 4747 factory.go:103] Registering Raw factory
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.006028 4747 manager.go:1196] Started watching for new ooms in manager
Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.006638 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms"
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.007116 4747 manager.go:319] Starting recovery of all containers
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026012 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026113 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026156 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026184 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026216 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026242 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026267 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026299 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026329 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026362 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026391 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026420 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026453 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026491 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026514 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026542 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026575 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026600 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026638 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026664 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026717 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026752 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026780 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026819 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026848 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026877 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.026953 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027000 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027030 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027065 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027093 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027122 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027169 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027195 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027230 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027257 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027282 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027319 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027344 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027378 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027406 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027435 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027470 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027499 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027531 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027578 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.027608 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031552 4747 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031644 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031679 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031709 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031749 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031777 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031828 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031872 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031906 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.031980 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032011 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032049 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032074 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032112 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032144 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032180 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032208 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032239 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032277 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032304 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032330 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032365 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032393 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032427 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032454 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032481 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032518 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032544 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032578 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032606 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032635 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032669 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032694 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032752 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032777 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032805 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032835 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032886 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032921 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.032989 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033018 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033054 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033079 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033105 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033138 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033163 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033201 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033227 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033252 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033308 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033335 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033369 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033409 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033432 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033467 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033492 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033524 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033548 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033598 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033662 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: 
I0930 18:46:01.033693 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033847 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033893 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.033961 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034049 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034115 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034155 4747 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034223 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034262 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034299 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034326 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034352 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034407 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034438 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034473 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034506 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034533 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034589 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.034620 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036100 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036175 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036226 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036254 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036280 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036313 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036362 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036398 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036426 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036450 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036481 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036507 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" 
seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036580 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036661 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036686 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036714 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036749 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036775 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.036803 
4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037041 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037064 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037093 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037112 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037157 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037182 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037204 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037232 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037251 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037278 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037299 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037320 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037362 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037385 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037406 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037434 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037457 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037482 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037504 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037525 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037564 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037586 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037624 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037645 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037665 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037690 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037712 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037737 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037757 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037781 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037807 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037828 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037856 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037875 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037899 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037962 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" 
seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.037984 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038009 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038028 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038048 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038076 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038095 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 18:46:01 crc 
kubenswrapper[4747]: I0930 18:46:01.038116 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038142 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038161 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038186 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038207 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038226 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038253 4747 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038275 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038317 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038337 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038357 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038381 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038400 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038427 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038450 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038471 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038495 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038515 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038595 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038619 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038640 4747 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038679 4747 reconstruct.go:97] "Volume reconstruction finished" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.038694 4747 reconciler.go:26] "Reconciler: start to sync state" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.045447 4747 manager.go:324] Recovery completed Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.065018 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.067476 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.067667 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.067786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.068664 4747 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.068753 4747 
cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.068882 4747 state_mem.go:36] "Initialized new in-memory state store" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.081826 4747 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.084498 4747 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.085300 4747 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.085443 4747 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.085968 4747 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.085508 4747 policy_none.go:49] "None policy: Start" Sep 30 18:46:01 crc kubenswrapper[4747]: W0930 18:46:01.086190 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.086590 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.088005 4747 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.088113 4747 
state_mem.go:35] "Initializing new in-memory state store" Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.103714 4747 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.154657 4747 manager.go:334] "Starting Device Plugin manager" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.154736 4747 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.154753 4747 server.go:79] "Starting device plugin registration server" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.155296 4747 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.155334 4747 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.155612 4747 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.155733 4747 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.155746 4747 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.163618 4747 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.186727 4747 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 18:46:01 crc 
kubenswrapper[4747]: I0930 18:46:01.186842 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.188156 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.188214 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.188236 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.188413 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.188850 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.188920 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.189525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.189593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.189614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.189799 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.189881 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.189911 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.190325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.190365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.190377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.191145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.191186 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.191200 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.191916 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.191966 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.191978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.192112 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc 
kubenswrapper[4747]: I0930 18:46:01.192308 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.192347 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.192811 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.192850 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.192956 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.193146 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.193252 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.193309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.193329 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.193339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.193379 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.194053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.194099 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.194116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.194371 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.194421 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.195717 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.195767 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.195731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.195782 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.195797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.195818 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.207788 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.244358 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.244449 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.244498 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.244685 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.244799 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.244876 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.244965 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.245001 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.245023 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.245041 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.245066 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.245083 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.245096 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.245121 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.245502 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.256529 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.260671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.260720 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc 
kubenswrapper[4747]: I0930 18:46:01.260731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.260784 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.261319 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347125 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347184 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347213 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347249 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347274 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347299 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347323 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347344 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347346 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347440 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347373 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347357 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347446 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347496 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347430 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347535 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347543 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347545 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347547 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347608 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347562 4747 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347569 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347720 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347760 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347791 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347797 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347688 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347855 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.347951 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.462075 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.463452 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.463501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.463510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.463536 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.463983 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.515278 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.523854 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.552353 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: W0930 18:46:01.570841 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c6bc694b5c614a94d72c5958434d2a22f190a127140ca7f29e917117217367fa WatchSource:0}: Error finding container c6bc694b5c614a94d72c5958434d2a22f190a127140ca7f29e917117217367fa: Status 404 returned error can't find the container with id c6bc694b5c614a94d72c5958434d2a22f190a127140ca7f29e917117217367fa Sep 30 18:46:01 crc kubenswrapper[4747]: W0930 18:46:01.574205 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e1e3d8704781379f792929481f11df07673dfe856df9bd35b8e76077983c06bf WatchSource:0}: Error finding container e1e3d8704781379f792929481f11df07673dfe856df9bd35b8e76077983c06bf: Status 404 returned error can't find the container with id e1e3d8704781379f792929481f11df07673dfe856df9bd35b8e76077983c06bf Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.581801 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: W0930 18:46:01.583834 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b846e359cc510ada40501f436622c04d401cefa27713966f219be0ea0465f14c WatchSource:0}: Error finding container b846e359cc510ada40501f436622c04d401cefa27713966f219be0ea0465f14c: Status 404 returned error can't find the container with id b846e359cc510ada40501f436622c04d401cefa27713966f219be0ea0465f14c Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.587036 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.609220 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Sep 30 18:46:01 crc kubenswrapper[4747]: W0930 18:46:01.611978 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-4455a1676d6f32fc543bbbed37eeef3d76b84aab0f44ac5f63c820c9c65b4e21 WatchSource:0}: Error finding container 4455a1676d6f32fc543bbbed37eeef3d76b84aab0f44ac5f63c820c9c65b4e21: Status 404 returned error can't find the container with id 4455a1676d6f32fc543bbbed37eeef3d76b84aab0f44ac5f63c820c9c65b4e21 Sep 30 18:46:01 crc kubenswrapper[4747]: W0930 18:46:01.617840 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-47f25ca1db5e6fcab3b12ee65e6ec32f99e0fb4c00c50b66cb2a36f7f63db668 WatchSource:0}: 
Error finding container 47f25ca1db5e6fcab3b12ee65e6ec32f99e0fb4c00c50b66cb2a36f7f63db668: Status 404 returned error can't find the container with id 47f25ca1db5e6fcab3b12ee65e6ec32f99e0fb4c00c50b66cb2a36f7f63db668 Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.865180 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.867233 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.867302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.867322 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:01 crc kubenswrapper[4747]: I0930 18:46:01.867364 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 18:46:01 crc kubenswrapper[4747]: E0930 18:46:01.868038 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.000283 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:02 crc kubenswrapper[4747]: W0930 18:46:02.008070 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:02 crc kubenswrapper[4747]: E0930 18:46:02.008181 
4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.090955 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b846e359cc510ada40501f436622c04d401cefa27713966f219be0ea0465f14c"} Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.093107 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e1e3d8704781379f792929481f11df07673dfe856df9bd35b8e76077983c06bf"} Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.094279 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c6bc694b5c614a94d72c5958434d2a22f190a127140ca7f29e917117217367fa"} Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.095608 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"47f25ca1db5e6fcab3b12ee65e6ec32f99e0fb4c00c50b66cb2a36f7f63db668"} Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.096877 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4455a1676d6f32fc543bbbed37eeef3d76b84aab0f44ac5f63c820c9c65b4e21"} Sep 30 18:46:02 crc kubenswrapper[4747]: 
W0930 18:46:02.399732 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:02 crc kubenswrapper[4747]: E0930 18:46:02.399851 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Sep 30 18:46:02 crc kubenswrapper[4747]: E0930 18:46:02.410881 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Sep 30 18:46:02 crc kubenswrapper[4747]: W0930 18:46:02.467521 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:02 crc kubenswrapper[4747]: E0930 18:46:02.467642 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Sep 30 18:46:02 crc kubenswrapper[4747]: W0930 18:46:02.507232 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:02 crc kubenswrapper[4747]: E0930 18:46:02.507373 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.668463 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.670354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.670398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.670409 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.670434 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 18:46:02 crc kubenswrapper[4747]: E0930 18:46:02.670795 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Sep 30 18:46:02 crc kubenswrapper[4747]: I0930 18:46:02.999755 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 
18:46:03.102594 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc" exitCode=0 Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.102699 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc"} Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.102864 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.105973 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.106035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.106053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.107171 4747 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4" exitCode=0 Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.107246 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4"} Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.107315 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 
18:46:03.108489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.108528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.108540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.108728 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.110481 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.110513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.110635 4747 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085" exitCode=0 Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.110733 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.110723 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085"} Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.110735 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.112232 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.112266 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.112281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.114400 4747 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1" exitCode=0 Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.114472 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.114509 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1"} Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.115994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.116071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.116864 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.118798 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098"} Sep 30 18:46:03 crc 
kubenswrapper[4747]: I0930 18:46:03.118834 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5"} Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.118846 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b"} Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.118856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637"} Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.118914 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.119451 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.119473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.119484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:03 crc kubenswrapper[4747]: W0930 18:46:03.871536 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused 
Sep 30 18:46:03 crc kubenswrapper[4747]: E0930 18:46:03.871619 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Sep 30 18:46:03 crc kubenswrapper[4747]: I0930 18:46:03.999820 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:04 crc kubenswrapper[4747]: E0930 18:46:04.011756 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.124616 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"17c682b33e155a43e48d8173b084a93df1a6badd45c3c1fc9dbeb8daa9959952"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.124711 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.126700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.126745 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.126762 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.130859 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.130902 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f576058685fcedbc1ea1a8a7106db5512b70f1ce58abb90dc6a1588b12f07985"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.130946 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.130966 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.132024 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.132063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.132080 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.133141 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.133325 4747 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2" exitCode=0 Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.133421 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.133980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.134026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.134043 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.145379 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.145455 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.145472 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.145480 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.145588 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174"} Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.147207 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.147269 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.147289 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.271639 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.272679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.272716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.272729 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:04 crc kubenswrapper[4747]: I0930 18:46:04.272751 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 18:46:04 crc kubenswrapper[4747]: E0930 18:46:04.273176 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": 
dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Sep 30 18:46:04 crc kubenswrapper[4747]: W0930 18:46:04.576171 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:04 crc kubenswrapper[4747]: E0930 18:46:04.576281 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Sep 30 18:46:04 crc kubenswrapper[4747]: W0930 18:46:04.621384 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Sep 30 18:46:04 crc kubenswrapper[4747]: E0930 18:46:04.621523 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.151819 4747 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7" exitCode=0 Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.151978 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7"} Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.152025 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.155168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.155229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.155253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.157754 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d"} Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.157822 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.157905 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.157919 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.159258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.159298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:05 crc kubenswrapper[4747]: 
I0930 18:46:05.159314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.159732 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.159794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.159819 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.160486 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.161658 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.161708 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.161726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.608383 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.658632 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.755879 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.955774 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.955987 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.957440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.957474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:05 crc kubenswrapper[4747]: I0930 18:46:05.957486 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.164590 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.165266 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147"} Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.165302 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7"} Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.165317 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2"} Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.165385 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 18:46:06 crc 
kubenswrapper[4747]: I0930 18:46:06.165415 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.166074 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.166099 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.166109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.169316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.169355 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.169368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.510720 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.511050 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.512787 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.513004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.513039 4747 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:06 crc kubenswrapper[4747]: I0930 18:46:06.520873 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.174991 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2"} Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.175086 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e"} Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.175130 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.175245 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.175126 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.175586 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.176533 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.176584 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.176610 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.177106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.177140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.177150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.177343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.177403 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.177457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.474200 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.476410 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.476472 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.476496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:07 crc kubenswrapper[4747]: I0930 18:46:07.476535 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.176839 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.177721 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.177754 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.179230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.179293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.179319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.179367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.179406 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:08 crc kubenswrapper[4747]: I0930 18:46:08.179446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:09 crc kubenswrapper[4747]: I0930 18:46:09.994031 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 18:46:09 crc kubenswrapper[4747]: I0930 18:46:09.994291 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:09 crc kubenswrapper[4747]: I0930 18:46:09.996401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:09 crc kubenswrapper[4747]: 
I0930 18:46:09.996457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:09 crc kubenswrapper[4747]: I0930 18:46:09.996474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:11 crc kubenswrapper[4747]: E0930 18:46:11.163782 4747 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.278489 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.278793 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.280357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.280396 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.280413 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.340358 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.340660 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.342570 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.342624 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:11 crc kubenswrapper[4747]: I0930 18:46:11.342644 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:12 crc kubenswrapper[4747]: I0930 18:46:12.768109 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 18:46:12 crc kubenswrapper[4747]: I0930 18:46:12.768410 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:12 crc kubenswrapper[4747]: I0930 18:46:12.770206 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:12 crc kubenswrapper[4747]: I0930 18:46:12.770275 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:12 crc kubenswrapper[4747]: I0930 18:46:12.770300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:14 crc kubenswrapper[4747]: I0930 18:46:14.341219 4747 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 18:46:14 crc kubenswrapper[4747]: I0930 18:46:14.341361 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 
30 18:46:15 crc kubenswrapper[4747]: I0930 18:46:15.000668 4747 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 30 18:46:15 crc kubenswrapper[4747]: W0930 18:46:15.571538 4747 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 18:46:15 crc kubenswrapper[4747]: I0930 18:46:15.571663 4747 trace.go:236] Trace[755676341]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 18:46:05.570) (total time: 10001ms): Sep 30 18:46:15 crc kubenswrapper[4747]: Trace[755676341]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:46:15.571) Sep 30 18:46:15 crc kubenswrapper[4747]: Trace[755676341]: [10.001368039s] [10.001368039s] END Sep 30 18:46:15 crc kubenswrapper[4747]: E0930 18:46:15.571694 4747 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 18:46:15 crc kubenswrapper[4747]: I0930 18:46:15.659123 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 18:46:15 crc kubenswrapper[4747]: I0930 18:46:15.659224 4747 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 18:46:16 crc kubenswrapper[4747]: I0930 18:46:16.069708 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 18:46:16 crc kubenswrapper[4747]: I0930 18:46:16.069788 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 18:46:18 crc kubenswrapper[4747]: I0930 18:46:18.184378 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:18 crc kubenswrapper[4747]: I0930 18:46:18.184563 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:18 crc kubenswrapper[4747]: I0930 18:46:18.186060 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:18 crc kubenswrapper[4747]: I0930 18:46:18.186115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:18 crc kubenswrapper[4747]: I0930 18:46:18.186133 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:20 crc kubenswrapper[4747]: I0930 18:46:20.666354 
4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:20 crc kubenswrapper[4747]: I0930 18:46:20.666518 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:20 crc kubenswrapper[4747]: I0930 18:46:20.667650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:20 crc kubenswrapper[4747]: I0930 18:46:20.667721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:20 crc kubenswrapper[4747]: I0930 18:46:20.667745 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:20 crc kubenswrapper[4747]: I0930 18:46:20.671158 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:21 crc kubenswrapper[4747]: E0930 18:46:21.059776 4747 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.063761 4747 trace.go:236] Trace[887670164]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 18:46:10.917) (total time: 10146ms): Sep 30 18:46:21 crc kubenswrapper[4747]: Trace[887670164]: ---"Objects listed" error: 10146ms (18:46:21.063) Sep 30 18:46:21 crc kubenswrapper[4747]: Trace[887670164]: [10.146174331s] [10.146174331s] END Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.063808 4747 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.065881 4747 trace.go:236] Trace[114913583]: "Reflector 
ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 18:46:09.503) (total time: 11562ms): Sep 30 18:46:21 crc kubenswrapper[4747]: Trace[114913583]: ---"Objects listed" error: 11562ms (18:46:21.065) Sep 30 18:46:21 crc kubenswrapper[4747]: Trace[114913583]: [11.562719435s] [11.562719435s] END Sep 30 18:46:21 crc kubenswrapper[4747]: E0930 18:46:21.065900 4747 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.065916 4747 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.066354 4747 trace.go:236] Trace[518329132]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 18:46:10.155) (total time: 10910ms): Sep 30 18:46:21 crc kubenswrapper[4747]: Trace[518329132]: ---"Objects listed" error: 10910ms (18:46:21.066) Sep 30 18:46:21 crc kubenswrapper[4747]: Trace[518329132]: [10.910424908s] [10.910424908s] END Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.066385 4747 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.069686 4747 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.106365 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35678->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.106411 4747 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.106446 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35678->192.168.126.11:17697: read: connection reset by peer" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.106476 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.217326 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.219573 4747 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d" exitCode=255 Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.219618 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d"} Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.280302 4747 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.335484 4747 scope.go:117] 
"RemoveContainer" containerID="7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.350731 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.355755 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.982432 4747 apiserver.go:52] "Watching apiserver" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.988887 4747 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.989299 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.989872 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:21 crc kubenswrapper[4747]: E0930 18:46:21.989951 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.990335 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.990471 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.990568 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:21 crc kubenswrapper[4747]: E0930 18:46:21.990700 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.991006 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.991096 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:21 crc kubenswrapper[4747]: E0930 18:46:21.991195 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.993275 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.994341 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.994790 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.995060 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.995235 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.995327 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.995398 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.997432 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 18:46:21 crc kubenswrapper[4747]: I0930 18:46:21.997554 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.004411 4747 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 
18:46:22.022975 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC 
(now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.032854 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.047056 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.056116 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.064712 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.073500 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078528 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078630 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078687 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078756 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078765 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078805 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078824 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078842 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078859 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078877 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078893 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078912 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078946 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078961 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078976 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 
18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.078990 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079003 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079028 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079056 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079053 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079076 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079113 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079141 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079165 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079184 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 18:46:22 crc 
kubenswrapper[4747]: I0930 18:46:22.079202 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079222 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079240 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079256 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079254 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079281 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079294 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079298 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079344 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079363 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079379 
4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079396 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079413 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079430 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079446 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079466 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079483 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079484 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079494 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079505 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079525 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079543 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079559 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079577 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079592 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079608 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079622 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079637 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079654 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079669 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 
18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079684 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079700 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079718 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079733 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079749 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079764 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079779 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079796 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079820 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079840 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079859 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " 
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079877 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079892 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079977 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079995 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080011 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080027 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080042 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080059 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080075 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080090 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080106 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 18:46:22 crc 
kubenswrapper[4747]: I0930 18:46:22.080122 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080139 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080158 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080175 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080193 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080210 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080227 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080243 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080287 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080303 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080320 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 
18:46:22.080337 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080353 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080369 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080385 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080430 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080446 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080465 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080481 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080496 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080513 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080541 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 
18:46:22.080556 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080574 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080591 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080605 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080621 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080638 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080654 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080670 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080685 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080701 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080716 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080731 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080746 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080762 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080778 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080794 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080810 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") 
" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080825 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080840 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080855 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080870 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080886 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080904 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080935 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080954 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080973 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080994 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081013 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 18:46:22 
crc kubenswrapper[4747]: I0930 18:46:22.081032 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081050 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081069 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081084 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081099 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081115 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081133 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079493 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081255 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079548 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079697 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079745 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079874 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.079984 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080084 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080100 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080163 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080273 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081381 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080325 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080352 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080721 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.080953 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081012 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081120 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081438 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081219 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081189 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081469 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081505 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081524 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081543 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081565 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081581 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081583 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081599 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081648 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081695 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081721 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081735 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081780 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081815 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081850 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081888 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081946 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082062 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082100 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082144 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082184 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082219 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082255 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082290 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082324 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082358 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082392 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082426 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082460 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082493 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082536 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082571 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082606 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082645 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082681 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082718 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082753 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082790 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082827 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082861 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082897 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082955 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082990 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083026 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083063 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083101 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083137 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083172 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083207 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083245 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083279 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083315 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083350 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083384 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083419 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083453 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083488 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083522 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083555 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083592 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083627 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083661 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083697 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083731 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083765 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083801 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083836 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083872 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.083960 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084000 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084038 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084079 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084119 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084189 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084237 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084275 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084356 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084391 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084447 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084519 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084580 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084637 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084679 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084732 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084780 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:22 crc 
kubenswrapper[4747]: I0930 18:46:22.084830 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084981 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085020 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085052 4747 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085083 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085115 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085148 4747 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085179 4747 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085208 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085240 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085302 4747 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085333 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085368 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085400 4747 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085434 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085474 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085512 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092627 4747 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081740 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.081823 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082006 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082079 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082052 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082183 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082196 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082243 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082270 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082306 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082457 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082512 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082559 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082673 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082689 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.095894 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082762 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082994 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.082975 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084153 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084385 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.084789 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085111 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085171 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085655 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.085771 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.086142 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.086350 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.086720 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.086982 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.087011 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.087077 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.087525 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.087696 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.087981 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.088059 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.088185 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.088210 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.088232 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.088597 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.088961 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.089227 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.088332 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.089260 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.089267 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.089495 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.089699 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.089856 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.089942 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.090097 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.090277 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.090346 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.090617 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.090654 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.091028 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.091618 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.091642 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.091760 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.091833 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.091899 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092134 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092170 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.091717 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092210 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092219 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092399 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092454 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092343 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092759 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092767 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092779 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.092821 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.093075 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.093231 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.093331 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.093441 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.093466 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.093615 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.093715 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.093949 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.094067 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.094278 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.095130 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.095684 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.096301 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.096316 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.096565 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.096660 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.096659 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.096746 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.096744 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.097073 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.097122 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.097153 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.097174 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.097750 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.097946 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:46:22.597848881 +0000 UTC m=+22.257329085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.098069 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.098284 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.098295 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.098542 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.098647 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.099080 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.099105 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.099326 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.099400 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.099775 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.100078 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.101945 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:22.601905973 +0000 UTC m=+22.261386327 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.102393 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.102410 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:22.602397829 +0000 UTC m=+22.261878173 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.102546 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.102572 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.102588 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.102672 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:22.602651267 +0000 UTC m=+22.262131611 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.089546 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.107176 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.115337 4747 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.115511 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.115581 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.115675 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:22.61565761 +0000 UTC m=+22.275137724 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.122059 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.123623 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.124679 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.125471 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.125528 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.125589 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.125579 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.125855 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.126367 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.126513 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.126682 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.127149 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.127173 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.127236 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.127397 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.127778 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.128027 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.128853 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.130072 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.131772 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.132245 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.132299 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.132372 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.133155 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.135172 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.148632 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.136020 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.149328 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.149384 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.149553 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.149947 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.150082 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.150114 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.150238 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.150319 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.150404 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.150436 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.133353 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.151047 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.151952 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.155304 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.157092 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.157160 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.157225 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.157616 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.157692 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.157745 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.160305 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.161799 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.161996 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.162017 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.162010 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.162165 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.162718 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.164632 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.164837 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.165111 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.165406 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.165760 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.168511 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.178479 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.180329 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.184902 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186054 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186098 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186176 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186192 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186193 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186205 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186242 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186256 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186270 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186279 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186283 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186314 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186329 4747 reconciler_common.go:293] 
"Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186342 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186355 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186366 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186378 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186390 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186402 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186416 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186427 4747 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186439 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186453 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186464 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186475 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186488 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186499 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" 
DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186510 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186522 4747 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186534 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186548 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186560 4747 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186572 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186584 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186594 4747 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186606 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186617 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186630 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186643 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186655 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186666 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186677 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" 
(UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186689 4747 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186701 4747 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186714 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186728 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186740 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186751 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186763 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186774 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186786 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186797 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186809 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186820 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186832 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186843 4747 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath 
\"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186854 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186865 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186877 4747 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186888 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186900 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186913 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186960 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186977 4747 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.186988 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187000 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187011 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187023 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187036 4747 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187047 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187058 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" 
DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187070 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187082 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187095 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187107 4747 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187118 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187131 4747 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187143 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 
18:46:22.187156 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187167 4747 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187180 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187191 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187203 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187215 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187226 4747 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187239 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187251 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187262 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187273 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187285 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187296 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187307 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187319 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" 
DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187331 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187342 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187354 4747 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187370 4747 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187381 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187403 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187416 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 
18:46:22.187427 4747 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187438 4747 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187450 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187463 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187475 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187487 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187498 4747 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187510 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" 
(UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187522 4747 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187534 4747 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187545 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187555 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187568 4747 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187580 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187591 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187605 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187616 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187628 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187640 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187651 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187664 4747 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187675 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 
18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187687 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187698 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187709 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187721 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187732 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187744 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187759 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187770 4747 reconciler_common.go:293] "Volume detached for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187783 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187794 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187806 4747 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187818 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187830 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187842 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187855 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" 
(UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187867 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187880 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187891 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187938 4747 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187951 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187962 4747 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187973 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" 
DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187983 4747 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.187995 4747 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188006 4747 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188018 4747 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188030 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188042 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188053 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188064 4747 reconciler_common.go:293] "Volume detached for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188075 4747 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188087 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188098 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188110 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188121 4747 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188133 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188145 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath 
\"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188156 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188169 4747 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188181 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188193 4747 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188204 4747 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188216 4747 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188228 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188240 4747 
reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188252 4747 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188264 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188288 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188302 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188315 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188327 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188340 4747 reconciler_common.go:293] "Volume 
detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188352 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188364 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188376 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188389 4747 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188401 4747 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.188412 4747 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.190315 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.191662 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.192057 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.225668 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.229424 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440"} Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.237964 4747 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.239610 4747 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.242550 4747 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.253023 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.266824 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.274787 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.287291 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.289227 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.289255 4747 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.297449 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.308523 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.315377 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.315657 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.323657 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 18:46:22 crc kubenswrapper[4747]: W0930 18:46:22.330468 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5161d7d7fe5704078a356d4c9cbea047088950090839a66ebee95eda37512753 WatchSource:0}: Error finding container 5161d7d7fe5704078a356d4c9cbea047088950090839a66ebee95eda37512753: Status 404 returned error can't find the container with id 5161d7d7fe5704078a356d4c9cbea047088950090839a66ebee95eda37512753 Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.332163 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.694258 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.694346 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.694385 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694422 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:46:23.694399335 +0000 UTC m=+23.353879459 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.694450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.694482 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694530 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694550 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694556 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694564 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694607 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:23.694595981 +0000 UTC m=+23.354076095 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694623 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:23.694615842 +0000 UTC m=+23.354095966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694625 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694695 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694740 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694755 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694753 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:23.694721865 +0000 UTC m=+23.354202009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:22 crc kubenswrapper[4747]: E0930 18:46:22.694897 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:23.694849449 +0000 UTC m=+23.354329623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.803496 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-pkmxs"] Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.804027 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.806647 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.807044 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.807217 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.807859 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.808302 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.812022 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.812445 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-v2fkl"] Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.812808 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v2fkl" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.814316 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rcwt4"] Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.815367 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-4zjq4"] Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.815766 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4zjq4" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.816153 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.817078 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.817135 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.817278 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.821819 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.822103 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.822352 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.822366 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.822656 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.822778 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.823039 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.830004 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.833453 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.839752 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.851685 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.862648 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.873041 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.900332 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fce119-955f-405b-bfb3-96aa4b34aef7-proxy-tls\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.900384 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd4ml\" (UniqueName: \"kubernetes.io/projected/a3fce119-955f-405b-bfb3-96aa4b34aef7-kube-api-access-bd4ml\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 
30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.900510 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a3fce119-955f-405b-bfb3-96aa4b34aef7-mcd-auth-proxy-config\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.900570 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a3fce119-955f-405b-bfb3-96aa4b34aef7-rootfs\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.901146 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.924409 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.940639 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.950724 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.965267 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:22 crc kubenswrapper[4747]: I0930 18:46:22.987895 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-system-cni-dir\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001197 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-hostroot\") pod \"multus-4zjq4\" (UID: 
\"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001218 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/20d6dd78-38e3-4c23-9478-ba7779842d5b-hosts-file\") pod \"node-resolver-v2fkl\" (UID: \"20d6dd78-38e3-4c23-9478-ba7779842d5b\") " pod="openshift-dns/node-resolver-v2fkl" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001240 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-kubelet\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001256 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-multus-certs\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001280 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34f8698b-7682-4b27-99d0-d72fff30d5a8-cni-binary-copy\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001295 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-cni-multus\") pod \"multus-4zjq4\" (UID: 
\"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001494 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-os-release\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001515 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-cni-bin\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001529 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj88c\" (UniqueName: \"kubernetes.io/projected/20d6dd78-38e3-4c23-9478-ba7779842d5b-kube-api-access-cj88c\") pod \"node-resolver-v2fkl\" (UID: \"20d6dd78-38e3-4c23-9478-ba7779842d5b\") " pod="openshift-dns/node-resolver-v2fkl" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001546 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-k8s-cni-cncf-io\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001560 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-conf-dir\") pod \"multus-4zjq4\" (UID: 
\"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001578 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-system-cni-dir\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001592 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-cni-dir\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001606 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkqp4\" (UniqueName: \"kubernetes.io/projected/34f8698b-7682-4b27-99d0-d72fff30d5a8-kube-api-access-hkqp4\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001624 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a3fce119-955f-405b-bfb3-96aa4b34aef7-rootfs\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001661 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhw9r\" (UniqueName: \"kubernetes.io/projected/1ec942cb-ba9d-49cd-b746-b78c0b135bed-kube-api-access-vhw9r\") pod 
\"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001678 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-netns\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001698 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fce119-955f-405b-bfb3-96aa4b34aef7-proxy-tls\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001722 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd4ml\" (UniqueName: \"kubernetes.io/projected/a3fce119-955f-405b-bfb3-96aa4b34aef7-kube-api-access-bd4ml\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001739 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001756 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/a3fce119-955f-405b-bfb3-96aa4b34aef7-mcd-auth-proxy-config\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001771 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-socket-dir-parent\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001794 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-daemon-config\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001815 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-os-release\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001830 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-cnibin\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001857 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001882 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-etc-kubernetes\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001903 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cnibin\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.001939 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.002072 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a3fce119-955f-405b-bfb3-96aa4b34aef7-rootfs\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.003395 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a3fce119-955f-405b-bfb3-96aa4b34aef7-mcd-auth-proxy-config\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.014370 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fce119-955f-405b-bfb3-96aa4b34aef7-proxy-tls\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.016015 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.022126 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd4ml\" (UniqueName: \"kubernetes.io/projected/a3fce119-955f-405b-bfb3-96aa4b34aef7-kube-api-access-bd4ml\") pod \"machine-config-daemon-pkmxs\" (UID: \"a3fce119-955f-405b-bfb3-96aa4b34aef7\") " pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.039999 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.058640 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.067078 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.077585 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.086251 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.087184 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.087201 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.087331 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.087407 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.091709 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.092268 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.093094 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.093679 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.094271 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.094740 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.095325 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.095866 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.096220 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.096469 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.097005 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.097486 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.098116 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.098576 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.101304 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.101891 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102237 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-socket-dir-parent\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102268 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-daemon-config\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-os-release\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102312 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 
18:46:23.102330 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-cnibin\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102346 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cnibin\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102361 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102379 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-etc-kubernetes\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102395 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-system-cni-dir\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102413 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-hostroot\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102429 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/20d6dd78-38e3-4c23-9478-ba7779842d5b-hosts-file\") pod \"node-resolver-v2fkl\" (UID: \"20d6dd78-38e3-4c23-9478-ba7779842d5b\") " pod="openshift-dns/node-resolver-v2fkl" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102449 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34f8698b-7682-4b27-99d0-d72fff30d5a8-cni-binary-copy\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102523 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cnibin\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102547 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-system-cni-dir\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102563 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-socket-dir-parent\") pod \"multus-4zjq4\" (UID: 
\"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102575 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-os-release\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102598 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/20d6dd78-38e3-4c23-9478-ba7779842d5b-hosts-file\") pod \"node-resolver-v2fkl\" (UID: \"20d6dd78-38e3-4c23-9478-ba7779842d5b\") " pod="openshift-dns/node-resolver-v2fkl" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102564 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-cnibin\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102627 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-hostroot\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102645 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-etc-kubernetes\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102467 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-cni-multus\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102684 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-kubelet\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102701 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-multus-certs\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102720 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-cni-multus\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102734 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-os-release\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102740 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-kubelet\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102753 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-multus-certs\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102772 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-cni-bin\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102805 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj88c\" (UniqueName: \"kubernetes.io/projected/20d6dd78-38e3-4c23-9478-ba7779842d5b-kube-api-access-cj88c\") pod \"node-resolver-v2fkl\" (UID: \"20d6dd78-38e3-4c23-9478-ba7779842d5b\") " pod="openshift-dns/node-resolver-v2fkl" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102812 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-os-release\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102825 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-k8s-cni-cncf-io\") pod \"multus-4zjq4\" (UID: 
\"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102842 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-conf-dir\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102848 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-k8s-cni-cncf-io\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102822 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-var-lib-cni-bin\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102871 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-system-cni-dir\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102889 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-cni-dir\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc 
kubenswrapper[4747]: I0930 18:46:23.102907 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-system-cni-dir\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102909 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhw9r\" (UniqueName: \"kubernetes.io/projected/1ec942cb-ba9d-49cd-b746-b78c0b135bed-kube-api-access-vhw9r\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkqp4\" (UniqueName: \"kubernetes.io/projected/34f8698b-7682-4b27-99d0-d72fff30d5a8-kube-api-access-hkqp4\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102991 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-netns\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.103008 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc 
kubenswrapper[4747]: I0930 18:46:23.103026 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ec942cb-ba9d-49cd-b746-b78c0b135bed-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.103058 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-cni-dir\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.102888 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-conf-dir\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.103119 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34f8698b-7682-4b27-99d0-d72fff30d5a8-host-run-netns\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.103140 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/34f8698b-7682-4b27-99d0-d72fff30d5a8-multus-daemon-config\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.103315 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/34f8698b-7682-4b27-99d0-d72fff30d5a8-cni-binary-copy\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.103352 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.103395 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.103563 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ec942cb-ba9d-49cd-b746-b78c0b135bed-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.104138 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.104555 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.105146 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.105723 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.106191 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.106743 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.108570 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.109200 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.110001 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.110574 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.111560 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.112056 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.113506 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.114073 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.114508 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.114546 4747 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.114639 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.116757 4747 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.117509 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.117965 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.120383 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.121090 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.122009 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.122646 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.123697 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.124253 4747 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.125239 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.125348 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.129883 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj88c\" (UniqueName: \"kubernetes.io/projected/20d6dd78-38e3-4c23-9478-ba7779842d5b-kube-api-access-cj88c\") pod \"node-resolver-v2fkl\" (UID: \"20d6dd78-38e3-4c23-9478-ba7779842d5b\") " pod="openshift-dns/node-resolver-v2fkl" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.130137 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.130765 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.131703 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.132342 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 
18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.132732 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v2fkl" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.133401 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.133794 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.134164 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhw9r\" (UniqueName: \"kubernetes.io/projected/1ec942cb-ba9d-49cd-b746-b78c0b135bed-kube-api-access-vhw9r\") pod \"multus-additional-cni-plugins-rcwt4\" (UID: \"1ec942cb-ba9d-49cd-b746-b78c0b135bed\") " pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.134388 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.134865 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.136029 4747 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.136549 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.145257 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkqp4\" (UniqueName: \"kubernetes.io/projected/34f8698b-7682-4b27-99d0-d72fff30d5a8-kube-api-access-hkqp4\") pod \"multus-4zjq4\" (UID: \"34f8698b-7682-4b27-99d0-d72fff30d5a8\") " pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.145984 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.146080 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4zjq4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.146130 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.146902 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.147753 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.148129 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.155938 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.168333 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.186330 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pnqjs"] Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.187577 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.191297 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.191540 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.191671 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.191849 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.192027 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.192980 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.199182 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.209308 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.230322 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.237428 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d00dd81c07b66e744db599447b17819532d78d8f72193e52cd2ee0c21955c849"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.242387 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.244097 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.244156 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.244172 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5161d7d7fe5704078a356d4c9cbea047088950090839a66ebee95eda37512753"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.247366 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v2fkl" event={"ID":"20d6dd78-38e3-4c23-9478-ba7779842d5b","Type":"ContainerStarted","Data":"0503080fe2045086492df4665e6a3b7c5d0a061304219d3a2fe1396156d1a651"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.250270 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.250296 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"33bc8527a3cdc978ad74a06560558b5e0ad71b86de6db9fdd5703b6f68086687"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.251265 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zjq4" event={"ID":"34f8698b-7682-4b27-99d0-d72fff30d5a8","Type":"ContainerStarted","Data":"b0cef3da1f6645d4c8711c6f5255b5875a6485010f0f52ed777d4bdb367cbb60"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.252838 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" event={"ID":"1ec942cb-ba9d-49cd-b746-b78c0b135bed","Type":"ContainerStarted","Data":"33f670a05a17e1c60f690cbdc06218a45fcf262bfa4de2c7dd6d808989f4557b"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.254916 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"c2f849101212bbea2336f5e7f39a93a3b1b5f2935f3ba0a12364386b82ab13cd"} Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.255670 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.257022 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.261587 4747 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.266045 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.277663 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.294501 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304347 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-var-lib-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304382 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304449 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovn-node-metrics-cert\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304467 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-kubelet\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304547 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-config\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304651 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-netd\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304715 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-env-overrides\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304765 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-slash\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304804 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-ovn\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304848 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304886 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-node-log\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304906 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-systemd-units\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304943 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-bin\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304979 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-script-lib\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.304996 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bsls\" (UniqueName: \"kubernetes.io/projected/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-kube-api-access-9bsls\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.305011 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-systemd\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: 
I0930 18:46:23.305026 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-netns\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.305119 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-etc-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.305156 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-log-socket\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.305223 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.305498 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.316303 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.332965 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2
f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.344270 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.367104 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.386839 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.403650 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.407389 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-slash\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.407670 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-slash\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.407725 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-ovn\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.407827 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-ovn\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.407889 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.407996 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408025 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-node-log\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408070 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-systemd-units\") 
pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408090 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-bin\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408225 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-script-lib\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408130 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-node-log\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408177 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-bin\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408154 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-systemd-units\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408489 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-systemd\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408543 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-systemd\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408601 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bsls\" (UniqueName: \"kubernetes.io/projected/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-kube-api-access-9bsls\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408749 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-netns\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.408798 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-script-lib\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: 
I0930 18:46:23.408634 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-netns\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409050 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-etc-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409124 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-etc-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409193 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-log-socket\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409423 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-log-socket\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409470 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409494 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409552 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409677 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409726 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-var-lib-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409761 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-kubelet\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409785 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovn-node-metrics-cert\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409815 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-netd\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409836 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-config\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.409869 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-env-overrides\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.410365 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-env-overrides\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.410423 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-var-lib-openvswitch\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.411392 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-kubelet\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.412246 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-netd\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.412966 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-config\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.415697 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovn-node-metrics-cert\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.419277 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.423794 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bsls\" (UniqueName: \"kubernetes.io/projected/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-kube-api-access-9bsls\") pod \"ovnkube-node-pnqjs\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.436994 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.454183 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.466473 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.480075 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.492009 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.508330 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.514026 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.530780 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.546970 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b49
1cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.565913 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: W0930 18:46:23.580611 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5851f3a5_36f6_4e85_8584_5ce70fda9d7d.slice/crio-1b1a9c722e051ed0fa5548647209c10e8877d9b139e5458e769b6a9e75e96baf WatchSource:0}: Error finding container 1b1a9c722e051ed0fa5548647209c10e8877d9b139e5458e769b6a9e75e96baf: Status 404 returned error can't find the container with id 1b1a9c722e051ed0fa5548647209c10e8877d9b139e5458e769b6a9e75e96baf Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.596289 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.639755 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.674055 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.713357 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.713454 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.713492 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.713517 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713637 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713657 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713670 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713685 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713717 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:25.713701936 +0000 UTC m=+25.373182060 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713713 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713753 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713769 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713773 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:25.713751348 +0000 UTC m=+25.373231462 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.713859 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:25.713839101 +0000 UTC m=+25.373319225 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.713968 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.714028 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:46:25.714017927 +0000 UTC m=+25.373498061 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.714049 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:23 crc kubenswrapper[4747]: E0930 18:46:23.714096 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:25.714087329 +0000 UTC m=+25.373567453 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:23 crc kubenswrapper[4747]: I0930 18:46:23.724541 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:23Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.086505 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:24 crc kubenswrapper[4747]: E0930 18:46:24.086681 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.258533 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zjq4" event={"ID":"34f8698b-7682-4b27-99d0-d72fff30d5a8","Type":"ContainerStarted","Data":"0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c"} Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.261008 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v2fkl" event={"ID":"20d6dd78-38e3-4c23-9478-ba7779842d5b","Type":"ContainerStarted","Data":"e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141"} Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.262535 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ec942cb-ba9d-49cd-b746-b78c0b135bed" containerID="9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd" exitCode=0 Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.262601 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" event={"ID":"1ec942cb-ba9d-49cd-b746-b78c0b135bed","Type":"ContainerDied","Data":"9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd"} Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.265540 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0"} Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.265574 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9"} Sep 30 18:46:24 
crc kubenswrapper[4747]: I0930 18:46:24.266845 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3" exitCode=0 Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.266890 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3"} Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.267003 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"1b1a9c722e051ed0fa5548647209c10e8877d9b139e5458e769b6a9e75e96baf"} Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.284177 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.314375 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.334266 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.360810 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.372471 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.384917 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.399271 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.415124 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.430607 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.453139 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.470302 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.484985 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.499318 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.513345 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.536242 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.550604 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.568036 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.579384 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.593287 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.608451 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.623695 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.641456 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.653293 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.678319 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.726352 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.753917 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.791165 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:24 crc kubenswrapper[4747]: I0930 18:46:24.837092 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:24Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.088302 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.088556 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.089744 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.089900 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.274967 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ec942cb-ba9d-49cd-b746-b78c0b135bed" containerID="a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2" exitCode=0 Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.275121 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" event={"ID":"1ec942cb-ba9d-49cd-b746-b78c0b135bed","Type":"ContainerDied","Data":"a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2"} Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.278522 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e"} Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.284688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493"} Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.284746 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560"} Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.284766 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8"} Sep 30 
18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.284783 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23"} Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.284801 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914"} Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.295175 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T
18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.313980 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.331548 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.370483 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.432492 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"n
ame\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.455188 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.468990 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.480763 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.498399 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.519716 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.531353 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.545541 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.559962 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.575694 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.589675 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.606153 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.622815 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.640402 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.665246 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.680425 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.695265 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.719206 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.736096 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.736305 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.736377 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.736442 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.736495 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.736659 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: 
object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.736859 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:29.736721495 +0000 UTC m=+29.396201629 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737171 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737212 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737254 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737334 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737352 4747 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:29.737316045 +0000 UTC m=+29.396796199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737401 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:29.737381867 +0000 UTC m=+29.396862001 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737506 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:46:29.73748997 +0000 UTC m=+29.396970114 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737546 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737571 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737588 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:25 crc kubenswrapper[4747]: E0930 18:46:25.737640 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:29.737626755 +0000 UTC m=+29.397106939 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.756527 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scri
pt\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.794987 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.837841 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.880219 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.923459 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:25 crc kubenswrapper[4747]: I0930 18:46:25.956274 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:25Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.086472 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:26 crc kubenswrapper[4747]: E0930 18:46:26.086701 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.213666 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sdgzs"] Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.214344 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.220172 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.220226 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.220442 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.222685 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.239854 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.257345 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.276870 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.292793 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.296381 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d"} Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.300598 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ec942cb-ba9d-49cd-b746-b78c0b135bed" 
containerID="58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc" exitCode=0 Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.300719 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" event={"ID":"1ec942cb-ba9d-49cd-b746-b78c0b135bed","Type":"ContainerDied","Data":"58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc"} Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.310828 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.340033 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.341219 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-host\") pod \"node-ca-sdgzs\" (UID: \"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.341258 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6bl\" (UniqueName: \"kubernetes.io/projected/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-kube-api-access-qk6bl\") pod \"node-ca-sdgzs\" (UID: \"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.341280 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-serviceca\") pod \"node-ca-sdgzs\" (UID: 
\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.356684 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.369909 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.392243 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.433731 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.442114 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6bl\" (UniqueName: \"kubernetes.io/projected/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-kube-api-access-qk6bl\") pod \"node-ca-sdgzs\" (UID: \"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.442167 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-serviceca\") pod \"node-ca-sdgzs\" (UID: \"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.442276 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-host\") pod \"node-ca-sdgzs\" (UID: \"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.442896 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-host\") pod \"node-ca-sdgzs\" (UID: \"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.444991 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-serviceca\") pod \"node-ca-sdgzs\" (UID: \"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.487381 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6bl\" (UniqueName: \"kubernetes.io/projected/83fbf7d9-81f5-4311-8619-3f0acd2c7fab-kube-api-access-qk6bl\") pod \"node-ca-sdgzs\" (UID: \"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\") " pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.498000 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.536205 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sdgzs" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.536192 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: W0930 18:46:26.556145 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83fbf7d9_81f5_4311_8619_3f0acd2c7fab.slice/crio-9aa297a7a9941beda8e32303775b4300f7ec990c9b2ce74ebfdf0fefe91ba926 WatchSource:0}: Error finding container 9aa297a7a9941beda8e32303775b4300f7ec990c9b2ce74ebfdf0fefe91ba926: Status 404 returned error can't find the container with id 9aa297a7a9941beda8e32303775b4300f7ec990c9b2ce74ebfdf0fefe91ba926 Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.580153 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.630131 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.656564 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.697432 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.732438 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.771970 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.811662 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.860707 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.894191 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.934604 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:26 crc kubenswrapper[4747]: I0930 18:46:26.979865 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:26Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.019031 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.058420 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.086295 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.086454 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:27 crc kubenswrapper[4747]: E0930 18:46:27.086780 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:27 crc kubenswrapper[4747]: E0930 18:46:27.087021 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.101539 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.138323 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.179744 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.222712 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 
18:46:27.271835 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.306597 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sdgzs" event={"ID":"83fbf7d9-81f5-4311-8619-3f0acd2c7fab","Type":"ContainerStarted","Data":"15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.306691 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sdgzs" event={"ID":"83fbf7d9-81f5-4311-8619-3f0acd2c7fab","Type":"ContainerStarted","Data":"9aa297a7a9941beda8e32303775b4300f7ec990c9b2ce74ebfdf0fefe91ba926"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.315364 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ec942cb-ba9d-49cd-b746-b78c0b135bed" containerID="998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8" exitCode=0 Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.315454 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" event={"ID":"1ec942cb-ba9d-49cd-b746-b78c0b135bed","Type":"ContainerDied","Data":"998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.332825 4747 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.352061 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.381136 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.415957 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.459325 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.466150 4747 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.469213 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.469299 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc 
kubenswrapper[4747]: I0930 18:46:27.469328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.469562 4747 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.529756 4747 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.530082 4747 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.531448 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.531486 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.531494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.531511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.531524 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.533359 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: E0930 18:46:27.552545 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.559630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.559682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.559700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.559724 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.559742 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: E0930 18:46:27.577723 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.581739 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.582964 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.582995 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.583012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.583035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.583083 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: E0930 18:46:27.600726 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.605794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.605849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.605868 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.605894 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.605911 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.613378 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: E0930 18:46:27.623911 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.628463 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.628515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.628539 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.628564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.628584 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: E0930 18:46:27.640659 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: E0930 18:46:27.640780 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.643090 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.643127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.643138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.643155 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.643167 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.657248 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.694010 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.738548 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.748303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.748370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.748389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.748417 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.748436 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.774949 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.822544 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 
18:46:27.852529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.852594 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.852616 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.852642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.852659 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.870593 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.899452 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.952482 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.957569 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.957626 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.957652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.957683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.957706 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:27Z","lastTransitionTime":"2025-09-30T18:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:27 crc kubenswrapper[4747]: I0930 18:46:27.979092 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:27Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.021217 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.061231 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.061297 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 
18:46:28.061404 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.061423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.061446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.061464 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.087169 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:28 crc kubenswrapper[4747]: E0930 18:46:28.087391 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.100757 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.141921 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.164904 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.165012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.165038 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.165070 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.165087 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.182719 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.218069 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.259987 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.267663 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.267731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.267749 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.267776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.267794 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.301367 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.325508 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ec942cb-ba9d-49cd-b746-b78c0b135bed" containerID="cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8" exitCode=0 Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.325584 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" event={"ID":"1ec942cb-ba9d-49cd-b746-b78c0b135bed","Type":"ContainerDied","Data":"cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.332912 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.351663 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.371278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.371394 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.371546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.371579 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.371599 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.375974 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.418831 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.456184 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.475638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.475688 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.475699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.475715 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.475728 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.497201 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.531046 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.579895 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.580058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.580073 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.580082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.580094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.580104 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.618734 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.672512 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.683500 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.683557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.683579 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc 
kubenswrapper[4747]: I0930 18:46:28.683606 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.683629 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.705998 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.735412 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.786539 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.786581 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.786597 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.786618 4747 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.786636 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.788858 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.818376 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.859768 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.889867 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.889899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.889910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.889948 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.889961 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.894483 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.936252 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.975204 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:28Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.992165 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.992193 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.992201 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.992213 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:28 crc kubenswrapper[4747]: I0930 18:46:28.992223 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:28Z","lastTransitionTime":"2025-09-30T18:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.014221 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.067162 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.086641 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.086806 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.086944 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.087155 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.093874 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.093912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.093948 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.093968 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.093985 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.100683 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.197058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.197134 4747 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.197157 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.197182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.197201 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.301514 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.301577 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.301600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.301630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.301653 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.343779 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ec942cb-ba9d-49cd-b746-b78c0b135bed" containerID="4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc" exitCode=0 Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.343840 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" event={"ID":"1ec942cb-ba9d-49cd-b746-b78c0b135bed","Type":"ContainerDied","Data":"4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.380167 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.400668 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.405634 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.405725 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.405764 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc 
kubenswrapper[4747]: I0930 18:46:29.405786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.405800 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.426006 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.455164 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.478742 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.503431 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.509285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.509348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.509365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.509393 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.509412 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.523359 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae
69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.543547 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.563238 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.578856 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.601634 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.613310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.613364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.613383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.613407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.613424 4747 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.618537 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.633815 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.659724 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.696173 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.716663 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.716718 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.716735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.716762 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.716780 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.777920 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.778086 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.778145 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.778191 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778225 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:46:37.778188044 +0000 UTC m=+37.437668188 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778381 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778399 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778520 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778537 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778549 4747 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778600 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.778611 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778579 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778503 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:37.778475053 +0000 UTC m=+37.437955207 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778721 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:37.778705061 +0000 UTC m=+37.438185205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778742 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:37.778730782 +0000 UTC m=+37.438210936 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778757 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:29 crc kubenswrapper[4747]: E0930 18:46:29.778848 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:37.778821535 +0000 UTC m=+37.438301689 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.819742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.819783 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.819794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.819813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.819826 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.923161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.923639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.923656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.923676 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:29 crc kubenswrapper[4747]: I0930 18:46:29.923689 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:29Z","lastTransitionTime":"2025-09-30T18:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.026638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.026719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.026741 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.026770 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.026803 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.086146 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:30 crc kubenswrapper[4747]: E0930 18:46:30.086395 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.130789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.130854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.130872 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.130901 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.130920 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.233742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.233799 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.233816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.233842 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.233862 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.337113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.337175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.337192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.337219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.337237 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.354490 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" event={"ID":"1ec942cb-ba9d-49cd-b746-b78c0b135bed","Type":"ContainerStarted","Data":"e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.362026 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.362494 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.362560 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.375583 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.394081 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.418085 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.419590 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.422285 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.497642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.497687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.497704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.497726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.497743 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.500256 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.521193 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.540338 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.556052 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.582965 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.598899 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.608695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.608766 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.608790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.608817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.608838 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.618107 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.650800 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.669807 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.688987 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.705489 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.711751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.712068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.712312 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.712541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.712732 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.718567 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 
18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.729839 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.745841 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.766811 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.785703 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.802181 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.815319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.815382 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.815403 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.815431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.815451 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.833638 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.853264 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.868012 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.887239 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.897620 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.914208 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.918306 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.918339 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.918353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.918369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.918380 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:30Z","lastTransitionTime":"2025-09-30T18:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.925729 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.937764 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.948890 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:30 crc kubenswrapper[4747]: I0930 18:46:30.961655 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:30Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.021775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.021827 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.021846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.021870 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.021889 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.086625 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.086651 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:31 crc kubenswrapper[4747]: E0930 18:46:31.086831 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:31 crc kubenswrapper[4747]: E0930 18:46:31.087004 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.106743 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.125212 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.125261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.125279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc 
kubenswrapper[4747]: I0930 18:46:31.125300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.125318 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.139466 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.155631 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.180616 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.201753 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.226291 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.229775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.230333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.231217 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.231270 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.231293 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.240620 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.255094 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.274556 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.282169 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.306738 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],
\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.333313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.333356 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.333366 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.333382 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.333400 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.335829 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae
69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.366534 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.379132 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.413431 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.435848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.436268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.436406 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.436552 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.436855 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.454744 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z 
is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.493340 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.540009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.540106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.540145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.540179 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.540207 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.544466 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.573882 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.618808 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.643221 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.643280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.643291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.643313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.643327 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.660297 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.697236 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.746786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.746842 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.746857 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.746884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.746902 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.748618 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.777798 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.820205 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.850228 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.850265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.850277 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.850296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.850309 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.857710 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.900460 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.940035 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.952835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.952870 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.952888 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.952912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.952959 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:31Z","lastTransitionTime":"2025-09-30T18:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:31 crc kubenswrapper[4747]: I0930 18:46:31.978636 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.017603 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:32Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.063191 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:32Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.066566 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.066809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.066997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.067180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.067319 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.086155 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:32 crc kubenswrapper[4747]: E0930 18:46:32.086247 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.091055 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:32Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.171115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.171179 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.171194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.171214 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc 
kubenswrapper[4747]: I0930 18:46:32.171231 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.273712 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.273748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.273760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.273777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.273790 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.370260 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.376960 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.376992 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.377003 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.377015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.377024 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.479885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.479922 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.479958 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.479977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.479989 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.582426 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.582479 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.582494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.582515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.582531 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.685301 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.685347 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.685357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.685374 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.685391 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.788704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.788775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.788788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.788809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.788820 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.892391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.892471 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.892495 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.892538 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.892566 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.995454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.995496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.995511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.995527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:32 crc kubenswrapper[4747]: I0930 18:46:32.995540 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:32Z","lastTransitionTime":"2025-09-30T18:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.086637 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.086684 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:33 crc kubenswrapper[4747]: E0930 18:46:33.086883 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:33 crc kubenswrapper[4747]: E0930 18:46:33.087103 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.097964 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.098044 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.098068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.098091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.098115 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.201754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.201831 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.201851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.201877 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.201896 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.304848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.304902 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.304920 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.304975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.304998 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.377410 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/0.log" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.382903 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa" exitCode=1 Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.382980 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.384007 4747 scope.go:117] "RemoveContainer" containerID="eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.408226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.408300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.408321 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.408344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.408362 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.430646 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad
62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.451420 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.469341 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.486750 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.505378 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.519728 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.519791 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.519813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.519844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.519867 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.527992 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae
69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.544536 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.565541 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.589643 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.617699 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.622148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.622172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.622180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.622192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.622201 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.641481 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:33Z\\\",\\\"message\\\":\\\" 18:46:33.081808 6069 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:46:33.081843 6069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:46:33.081854 6069 
handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:46:33.081852 6069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 18:46:33.081884 6069 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 18:46:33.081892 6069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:46:33.081905 6069 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:46:33.081909 6069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:46:33.081956 6069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 18:46:33.081991 6069 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 18:46:33.082027 6069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:46:33.082041 6069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:46:33.082090 6069 factory.go:656] Stopping watch factory\\\\nI0930 18:46:33.082111 6069 ovnkube.go:599] Stopped ovnkube\\\\nI0930 18:46:33.082143 6069 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 18:46:33.082159 6069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 
18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.657342 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.668861 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.682403 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.692427 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:33Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.724266 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.724302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.724312 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.724324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.724334 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.827233 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.827284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.827296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.827312 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.827325 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.930438 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.930496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.930511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.930534 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:33 crc kubenswrapper[4747]: I0930 18:46:33.930552 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:33Z","lastTransitionTime":"2025-09-30T18:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.032896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.032934 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.032942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.032954 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.032963 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.113766 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:34 crc kubenswrapper[4747]: E0930 18:46:34.113886 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.114197 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:34 crc kubenswrapper[4747]: E0930 18:46:34.114247 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.134815 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.134852 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.134862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.134879 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.134892 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.237192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.237240 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.237255 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.237276 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.237291 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.339870 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.339952 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.339971 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.339997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.340015 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.388269 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/0.log" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.390742 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.390876 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.408352 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] 
\\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.420173 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.432122 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.444889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.444947 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.444959 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.444977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.444993 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.447088 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae
69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.459949 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.475270 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.486025 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.500905 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.522444 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:33Z\\\",\\\"message\\\":\\\" 18:46:33.081808 6069 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:46:33.081843 6069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:46:33.081854 6069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:46:33.081852 6069 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0930 18:46:33.081884 6069 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 18:46:33.081892 6069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:46:33.081905 6069 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:46:33.081909 6069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:46:33.081956 6069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 18:46:33.081991 6069 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 18:46:33.082027 6069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:46:33.082041 6069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:46:33.082090 6069 factory.go:656] Stopping watch factory\\\\nI0930 18:46:33.082111 6069 ovnkube.go:599] Stopped ovnkube\\\\nI0930 18:46:33.082143 6069 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 18:46:33.082159 6069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 
18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.536290 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.547597 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.547648 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc 
kubenswrapper[4747]: I0930 18:46:34.547664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.547687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.547704 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.548862 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.563192 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.577407 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.609555 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.622227 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:34Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.650911 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.650965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.650974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.650989 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.650999 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.754774 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.754844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.754861 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.754885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.754905 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.857969 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.858023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.858034 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.858075 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.858089 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.961287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.961330 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.961364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.961381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:34 crc kubenswrapper[4747]: I0930 18:46:34.961395 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:34Z","lastTransitionTime":"2025-09-30T18:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.064155 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.064214 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.064234 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.064259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.064276 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.086999 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:35 crc kubenswrapper[4747]: E0930 18:46:35.087171 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.167308 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.167377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.167401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.167430 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.167452 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.270232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.270281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.270292 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.270309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.270321 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.373324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.373378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.373391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.373411 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.373424 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.398131 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/1.log" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.399128 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/0.log" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.403461 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53" exitCode=1 Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.403537 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.403713 4747 scope.go:117] "RemoveContainer" containerID="eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.404696 4747 scope.go:117] "RemoveContainer" containerID="625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53" Sep 30 18:46:35 crc kubenswrapper[4747]: E0930 18:46:35.404955 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.435917 4747 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d
950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da7729
07e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.459451 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.476386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.476436 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.476446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.476466 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.476480 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.480754 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.497193 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 
2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.515278 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.531661 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.563395 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.579249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.579308 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.579326 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.579733 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.579790 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.590863 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:33Z\\\",\\\"message\\\":\\\" 18:46:33.081808 6069 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:46:33.081843 6069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:46:33.081854 6069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:46:33.081852 6069 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0930 18:46:33.081884 6069 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 18:46:33.081892 6069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:46:33.081905 6069 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:46:33.081909 6069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:46:33.081956 6069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 18:46:33.081991 6069 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 18:46:33.082027 6069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:46:33.082041 6069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:46:33.082090 6069 factory.go:656] Stopping watch factory\\\\nI0930 18:46:33.082111 6069 ovnkube.go:599] Stopped ovnkube\\\\nI0930 18:46:33.082143 6069 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 18:46:33.082159 6069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:34Z\\\",\\\"message\\\":\\\" column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 18:46:34.489550 6188 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nI0930 18:46:34.489547 6188 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} 
name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:34.489593 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.604717 4747 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.618409 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.630722 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.648726 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.661348 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.676434 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.682211 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.682242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.682258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.682279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.682295 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.690425 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.785680 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.785745 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.785763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.785788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.785805 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.813984 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t"] Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.814485 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.818340 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.818394 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.832949 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.847601 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.857645 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.857744 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.857796 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.857949 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qzh8f\" (UniqueName: \"kubernetes.io/projected/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-kube-api-access-qzh8f\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.866674 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.879759 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.889042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.889091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.889104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.889119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.889132 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.906313 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.918076 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.930640 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.947038 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 
1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.959301 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzh8f\" (UniqueName: \"kubernetes.io/projected/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-kube-api-access-qzh8f\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.959404 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.959445 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.959484 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.960139 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.960309 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.964595 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.967698 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.979439 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.988068 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzh8f\" (UniqueName: \"kubernetes.io/projected/f0a54fa2-898d-46ef-bb62-be104bf8c2fe-kube-api-access-qzh8f\") pod \"ovnkube-control-plane-749d76644c-8r68t\" (UID: \"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.991837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.992028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.992123 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.992215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.992287 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:35Z","lastTransitionTime":"2025-09-30T18:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:35 crc kubenswrapper[4747]: I0930 18:46:35.996700 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae
69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:35Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.009411 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:36Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.024263 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:36Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.034998 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:36Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.053611 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:36Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.074543 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:33Z\\\",\\\"message\\\":\\\" 18:46:33.081808 6069 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:46:33.081843 6069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:46:33.081854 6069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:46:33.081852 6069 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0930 18:46:33.081884 6069 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 18:46:33.081892 6069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:46:33.081905 6069 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:46:33.081909 6069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:46:33.081956 6069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 18:46:33.081991 6069 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 18:46:33.082027 6069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:46:33.082041 6069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:46:33.082090 6069 factory.go:656] Stopping watch factory\\\\nI0930 18:46:33.082111 6069 ovnkube.go:599] Stopped ovnkube\\\\nI0930 18:46:33.082143 6069 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 18:46:33.082159 6069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:34Z\\\",\\\"message\\\":\\\" column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 18:46:34.489550 6188 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nI0930 18:46:34.489547 6188 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} 
name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:34.489593 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:36Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.086531 4747 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:36 crc kubenswrapper[4747]: E0930 18:46:36.086670 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.086545 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:36 crc kubenswrapper[4747]: E0930 18:46:36.086755 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.094435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.094548 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.094656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.094816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.094902 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.134323 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" Sep 30 18:46:36 crc kubenswrapper[4747]: W0930 18:46:36.152997 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0a54fa2_898d_46ef_bb62_be104bf8c2fe.slice/crio-42b704dfa588a7b2ccf49bbe6953770829c02da91066a7553642fd56ad326d84 WatchSource:0}: Error finding container 42b704dfa588a7b2ccf49bbe6953770829c02da91066a7553642fd56ad326d84: Status 404 returned error can't find the container with id 42b704dfa588a7b2ccf49bbe6953770829c02da91066a7553642fd56ad326d84 Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.198306 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.198359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.198376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.198407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.198426 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.300983 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.301025 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.301035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.301050 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.301060 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.402885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.402955 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.402975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.403001 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.403019 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.409535 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/1.log" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.423598 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" event={"ID":"f0a54fa2-898d-46ef-bb62-be104bf8c2fe","Type":"ContainerStarted","Data":"c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.423665 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" event={"ID":"f0a54fa2-898d-46ef-bb62-be104bf8c2fe","Type":"ContainerStarted","Data":"42b704dfa588a7b2ccf49bbe6953770829c02da91066a7553642fd56ad326d84"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.506489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.506557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.506577 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.506603 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.506620 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.609461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.609493 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.609504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.609520 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.609532 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.712616 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.712981 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.713212 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.713802 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.713903 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.816371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.816631 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.816727 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.816804 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.816873 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.919757 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.919814 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.919830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.919855 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:36 crc kubenswrapper[4747]: I0930 18:46:36.919875 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:36Z","lastTransitionTime":"2025-09-30T18:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.022859 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.022962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.022988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.023019 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.023040 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.087199 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.087509 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.125462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.125527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.125545 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.125571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.125590 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.229077 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.229484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.229683 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.230015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.230326 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.331887 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fbzb6"] Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.333268 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.333609 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.333631 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.334056 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.334269 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.334469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.334663 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.355272 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de
2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.376323 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.376834 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6djlx\" (UniqueName: \"kubernetes.io/projected/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-kube-api-access-6djlx\") pod \"network-metrics-daemon-fbzb6\" (UID: 
\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.377276 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.400181 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.416040 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc 
kubenswrapper[4747]: I0930 18:46:37.428790 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" event={"ID":"f0a54fa2-898d-46ef-bb62-be104bf8c2fe","Type":"ContainerStarted","Data":"31df41c63b9035143bb3f1e1d6804a2312bf02103ccf10d8cab8cf8cb6b2c99b"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.437245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.437317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.437340 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.437371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.437395 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.447542 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:33Z\\\",\\\"message\\\":\\\" 18:46:33.081808 6069 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:46:33.081843 6069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:46:33.081854 6069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:46:33.081852 6069 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0930 18:46:33.081884 6069 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 18:46:33.081892 6069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:46:33.081905 6069 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:46:33.081909 6069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:46:33.081956 6069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 18:46:33.081991 6069 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 18:46:33.082027 6069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:46:33.082041 6069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:46:33.082090 6069 factory.go:656] Stopping watch factory\\\\nI0930 18:46:33.082111 6069 ovnkube.go:599] Stopped ovnkube\\\\nI0930 18:46:33.082143 6069 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 18:46:33.082159 6069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:34Z\\\",\\\"message\\\":\\\" column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 18:46:34.489550 6188 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nI0930 18:46:34.489547 6188 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} 
name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:34.489593 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.461441 4747 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.477844 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.477965 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6djlx\" (UniqueName: \"kubernetes.io/projected/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-kube-api-access-6djlx\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.478055 4747 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.479323 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.479500 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs podName:5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:37.979438267 +0000 UTC m=+37.638918431 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs") pod "network-metrics-daemon-fbzb6" (UID: "5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.500857 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.511058 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6djlx\" (UniqueName: \"kubernetes.io/projected/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-kube-api-access-6djlx\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.517586 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.536004 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.540528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.540571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.540583 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.540600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.540612 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.556352 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.572961 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.592453 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.608072 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.637736 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.643415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.643643 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.643706 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.643898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.644108 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.655253 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.673572 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.698420 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.719132 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.741308 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.747055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.747119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.747144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.747176 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.747199 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.761460 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.780694 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.780800 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.780855 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.780896 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781028 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781039 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:46:53.780990449 +0000 UTC m=+53.440470613 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781109 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781179 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781214 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781236 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:53.781219947 +0000 UTC m=+53.440700091 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781253 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781292 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.781204 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781324 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781301 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:53.781275328 +0000 UTC m=+53.440755502 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781344 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781369 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:53.781350821 +0000 UTC m=+53.440830965 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.781479 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:53.781458374 +0000 UTC m=+53.440938668 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.793154 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba8
78ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.811234 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.827539 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.850406 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.851345 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.851402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.851420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.851447 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.851465 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.865317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.865382 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.865407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.865440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.865464 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.872583 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.890343 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.895029 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.896247 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.896290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.896312 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.896336 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.896355 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.913320 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc 
kubenswrapper[4747]: E0930 18:46:37.918013 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.923386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.923449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.923467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.923494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.923512 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.937037 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae
69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.946619 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.955773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.955885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.955906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.955962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.955981 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.961750 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.977608 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.982466 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.982955 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.983100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.983185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.983290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.983377 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:37Z","lastTransitionTime":"2025-09-30T18:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:37 crc kubenswrapper[4747]: I0930 18:46:37.983197 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.983314 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.983839 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs podName:5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:38.983817436 +0000 UTC m=+38.643297580 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs") pod "network-metrics-daemon-fbzb6" (UID: "5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.999465 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8
dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:37Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:37 crc kubenswrapper[4747]: E0930 18:46:37.999866 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.002390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.002556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.002682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.002795 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.002876 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.003697 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:38Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.023137 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:38Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.045773 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:33Z\\\",\\\"message\\\":\\\" 18:46:33.081808 6069 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0930 18:46:33.081843 6069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:46:33.081854 6069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:46:33.081852 6069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 18:46:33.081884 6069 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 18:46:33.081892 6069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:46:33.081905 6069 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:46:33.081909 6069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:46:33.081956 6069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 18:46:33.081991 6069 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 18:46:33.082027 6069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:46:33.082041 6069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:46:33.082090 6069 factory.go:656] Stopping watch factory\\\\nI0930 18:46:33.082111 6069 ovnkube.go:599] Stopped ovnkube\\\\nI0930 18:46:33.082143 6069 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 18:46:33.082159 6069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:34Z\\\",\\\"message\\\":\\\" column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 18:46:34.489550 6188 services_controller.go:356] Processing sync for service 
openshift-ingress/router-internal-default for network=default\\\\nI0930 18:46:34.489547 6188 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:34.489593 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:38Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.086220 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.086220 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:38 crc kubenswrapper[4747]: E0930 18:46:38.086408 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:38 crc kubenswrapper[4747]: E0930 18:46:38.086511 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.105310 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.105364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.105384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.105406 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.105430 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.208901 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.209082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.209107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.209138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.209159 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.312317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.312713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.312972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.313303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.313674 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.417324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.417387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.417411 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.417437 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.417455 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.519768 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.520727 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.520919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.521314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.521642 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.625484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.625789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.625918 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.626107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.626225 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.728992 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.729401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.729802 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.730268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.730505 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.832904 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.833454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.833651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.834021 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.834413 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.937417 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.937519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.937544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.937574 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.937597 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:38Z","lastTransitionTime":"2025-09-30T18:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:38 crc kubenswrapper[4747]: I0930 18:46:38.995779 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:38 crc kubenswrapper[4747]: E0930 18:46:38.995965 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:38 crc kubenswrapper[4747]: E0930 18:46:38.996054 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs podName:5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:40.996031399 +0000 UTC m=+40.655511553 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs") pod "network-metrics-daemon-fbzb6" (UID: "5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.041191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.041303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.041327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.041350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.041367 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.086728 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.086827 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:39 crc kubenswrapper[4747]: E0930 18:46:39.087232 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:39 crc kubenswrapper[4747]: E0930 18:46:39.087416 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.148636 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.148721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.148750 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.148817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.148860 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.253022 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.253122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.253149 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.253181 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.253274 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.356615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.357071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.357214 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.357427 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.357626 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.460526 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.460621 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.460644 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.460671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.460693 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.563421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.563568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.563589 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.563613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.563660 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.667484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.667556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.667579 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.667647 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.667670 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.770751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.770805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.770822 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.770845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.770862 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.873721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.874141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.874316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.874566 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.874752 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.977952 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.978018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.978035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.978061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:39 crc kubenswrapper[4747]: I0930 18:46:39.978080 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:39Z","lastTransitionTime":"2025-09-30T18:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.081244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.081305 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.081323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.081351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.081369 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.086651 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.086653 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:40 crc kubenswrapper[4747]: E0930 18:46:40.086992 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:40 crc kubenswrapper[4747]: E0930 18:46:40.086805 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.184693 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.184747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.184763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.184788 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.184804 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.289124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.289235 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.289264 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.289294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.289316 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.392717 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.393160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.393357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.393575 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.394042 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.497511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.497599 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.497618 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.497642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.497664 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.600116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.601103 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.601154 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.601185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.601204 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.704799 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.704866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.704883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.704910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.704999 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.808704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.808766 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.808785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.808808 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.808826 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.911558 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.911611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.911629 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.911654 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:40 crc kubenswrapper[4747]: I0930 18:46:40.911671 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:40Z","lastTransitionTime":"2025-09-30T18:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.015266 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.015323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.015341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.015365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.015383 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.015901 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:41 crc kubenswrapper[4747]: E0930 18:46:41.016161 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:41 crc kubenswrapper[4747]: E0930 18:46:41.016281 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs podName:5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:45.016252987 +0000 UTC m=+44.675733131 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs") pod "network-metrics-daemon-fbzb6" (UID: "5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.087116 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.087148 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:41 crc kubenswrapper[4747]: E0930 18:46:41.087314 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:41 crc kubenswrapper[4747]: E0930 18:46:41.087895 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.103862 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.118591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.118640 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.118660 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc 
kubenswrapper[4747]: I0930 18:46:41.118685 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.118550 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.118703 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.135291 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998ba
b48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.163785 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb6e2137853dad5731e2e4f617c4e1e40ae7c2a428f662b42c2b8b4f2e2d59aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:33Z\\\",\\\"message\\\":\\\" 18:46:33.081808 6069 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:46:33.081843 6069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:46:33.081854 6069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:46:33.081852 6069 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0930 18:46:33.081884 6069 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 18:46:33.081892 6069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:46:33.081905 6069 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:46:33.081909 6069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:46:33.081956 6069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 18:46:33.081991 6069 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 18:46:33.082027 6069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:46:33.082041 6069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:46:33.082090 6069 factory.go:656] Stopping watch factory\\\\nI0930 18:46:33.082111 6069 ovnkube.go:599] Stopped ovnkube\\\\nI0930 18:46:33.082143 6069 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 18:46:33.082159 6069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:34Z\\\",\\\"message\\\":\\\" column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 18:46:34.489550 6188 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nI0930 18:46:34.489547 6188 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} 
name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:34.489593 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.181275 4747 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.200910 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.215848 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.221569 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.221621 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.221639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.221664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.221683 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.233327 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z 
is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.249731 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.277373 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.301440 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.324127 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.324176 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.324191 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.324211 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.324225 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.337125 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e
5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.350014 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.365372 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.379997 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.393628 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc 
kubenswrapper[4747]: I0930 18:46:41.416681 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557f
bfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:41Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.427189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.427270 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.427296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.427330 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.427356 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.530448 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.530511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.530532 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.530556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.530575 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.634151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.634211 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.634222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.634244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.634257 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.737522 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.737598 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.737627 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.737662 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.737689 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.840761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.840828 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.840848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.840876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.840896 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.944470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.944541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.944558 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.944583 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:41 crc kubenswrapper[4747]: I0930 18:46:41.944600 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:41Z","lastTransitionTime":"2025-09-30T18:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.048288 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.048349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.048367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.048393 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.048411 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.086968 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.087081 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:42 crc kubenswrapper[4747]: E0930 18:46:42.087210 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:42 crc kubenswrapper[4747]: E0930 18:46:42.087324 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.151650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.151719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.151737 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.151764 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.151784 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.255307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.255372 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.255389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.255414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.255432 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.358758 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.358835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.358852 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.358882 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.358903 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.461826 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.462102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.462269 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.462425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.462547 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.565620 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.565688 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.565706 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.565730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.565749 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.669243 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.669334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.669358 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.669399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.669423 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.772873 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.772957 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.772970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.773010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.773025 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.876524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.876609 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.876635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.876666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.876688 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.980522 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.980604 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.980624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.980650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:42 crc kubenswrapper[4747]: I0930 18:46:42.980667 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:42Z","lastTransitionTime":"2025-09-30T18:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.083513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.083595 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.083618 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.083646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.083669 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.086198 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.086323 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:43 crc kubenswrapper[4747]: E0930 18:46:43.086427 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:43 crc kubenswrapper[4747]: E0930 18:46:43.086583 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.186517 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.186571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.186588 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.186611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.186634 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.290498 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.290596 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.290615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.290642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.290660 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.394579 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.394651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.394669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.394694 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.394712 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.498284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.498369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.498388 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.498421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.498443 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.601782 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.602278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.602476 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.602706 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.602919 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.706544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.706596 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.706613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.706635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.706653 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.810232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.810748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.810980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.811219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.811410 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.914277 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.914333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.914350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.914377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:43 crc kubenswrapper[4747]: I0930 18:46:43.914395 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:43Z","lastTransitionTime":"2025-09-30T18:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.017415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.017491 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.017516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.017546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.017565 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.086582 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.086623 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:44 crc kubenswrapper[4747]: E0930 18:46:44.086787 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:44 crc kubenswrapper[4747]: E0930 18:46:44.086974 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.120779 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.120858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.120883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.120915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.120972 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.224336 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.224399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.224417 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.224443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.224466 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.327817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.327914 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.327986 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.328013 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.328034 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.431124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.431226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.431245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.431268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.431289 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.534407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.534473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.534497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.534524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.534543 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.638088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.638178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.638198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.638227 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.638252 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.741718 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.741827 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.741857 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.741888 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.741912 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.845611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.845669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.845724 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.845748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.845767 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.949533 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.949603 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.949625 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.949686 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:44 crc kubenswrapper[4747]: I0930 18:46:44.949707 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:44Z","lastTransitionTime":"2025-09-30T18:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.052838 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.052983 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.053008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.053037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.053059 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.066824 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:45 crc kubenswrapper[4747]: E0930 18:46:45.067096 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:45 crc kubenswrapper[4747]: E0930 18:46:45.067302 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs podName:5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8 nodeName:}" failed. No retries permitted until 2025-09-30 18:46:53.067255385 +0000 UTC m=+52.726735529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs") pod "network-metrics-daemon-fbzb6" (UID: "5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.087041 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.087089 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:45 crc kubenswrapper[4747]: E0930 18:46:45.087228 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:45 crc kubenswrapper[4747]: E0930 18:46:45.087441 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.155951 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.156021 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.156038 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.156063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.156080 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.259013 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.259117 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.259129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.259147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.259161 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.363062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.363129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.363150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.363175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.363193 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.478717 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.478898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.478997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.479032 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.479050 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.582513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.582585 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.582607 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.582638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.582659 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.686118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.686175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.686193 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.686223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.686247 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.789012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.789062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.789077 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.789100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.789118 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.891702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.891760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.891778 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.891802 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.891820 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.994691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.994755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.994778 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.994806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:45 crc kubenswrapper[4747]: I0930 18:46:45.994828 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:45Z","lastTransitionTime":"2025-09-30T18:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.086494 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.086499 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:46 crc kubenswrapper[4747]: E0930 18:46:46.086695 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:46 crc kubenswrapper[4747]: E0930 18:46:46.086797 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.099283 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.099359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.099378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.099406 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.099422 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.202467 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.202532 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.202552 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.202576 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.202594 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.306145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.306217 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.306231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.306250 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.306264 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.409029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.409140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.409167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.409200 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.409223 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.511982 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.512078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.512098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.512158 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.512179 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.614771 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.614803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.614811 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.614825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.614835 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.718375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.718473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.718496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.718526 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.718549 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.821991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.822050 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.822066 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.822092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.822110 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.925274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.925346 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.925364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.925390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:46 crc kubenswrapper[4747]: I0930 18:46:46.925409 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:46Z","lastTransitionTime":"2025-09-30T18:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.028977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.029043 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.029059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.029085 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.029103 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.087072 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.087118 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:47 crc kubenswrapper[4747]: E0930 18:46:47.087283 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:47 crc kubenswrapper[4747]: E0930 18:46:47.087488 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.132130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.132183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.132198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.132219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.132234 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.235908 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.236018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.236053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.236086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.236109 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.339175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.339248 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.339267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.339295 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.339321 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.442662 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.442741 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.442771 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.442801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.442824 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.546112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.546177 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.546197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.546222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.546242 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.650025 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.650100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.650126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.650155 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.650178 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.754399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.754472 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.754490 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.754516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.754534 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.858415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.858489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.858506 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.858532 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.858550 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.962734 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.962829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.962850 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.962881 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:47 crc kubenswrapper[4747]: I0930 18:46:47.962912 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:47Z","lastTransitionTime":"2025-09-30T18:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.065823 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.065907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.065970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.066002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.066026 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.086231 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.086333 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:48 crc kubenswrapper[4747]: E0930 18:46:48.086879 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:48 crc kubenswrapper[4747]: E0930 18:46:48.087043 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.087268 4747 scope.go:117] "RemoveContainer" containerID="625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.111408 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c873
6e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.136012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.136066 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.136080 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.136098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 
18:46:48.136112 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.144152 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:34Z\\\",\\\"message\\\":\\\" column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 18:46:34.489550 6188 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nI0930 18:46:34.489547 6188 
transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:34.489593 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: E0930 18:46:48.155356 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.160465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.160611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.160709 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.160811 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.160915 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.164246 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae
69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: E0930 18:46:48.181805 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.185797 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.187599 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.187710 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc 
kubenswrapper[4747]: I0930 18:46:48.187779 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.187848 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.187944 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: E0930 18:46:48.202405 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.205391 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.207755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.207811 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.207829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc 
kubenswrapper[4747]: I0930 18:46:48.207853 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.207871 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.222437 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: E0930 18:46:48.227464 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.232869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.232978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.233000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.233026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.233046 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.237982 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: E0930 18:46:48.255112 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: E0930 18:46:48.255742 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.259661 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.259835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.259897 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.259921 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.260051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.260274 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.284301 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.303166 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.335025 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"
},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.349963 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.363370 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.363666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.363696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.363707 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.363724 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.363735 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.373254 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 
30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.391845 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastS
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.407858 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.426038 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.467251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.467307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.467315 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.467329 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.467339 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.482028 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/1.log" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.485841 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.485973 4747 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.503790 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.523248 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.548958 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.570424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.570494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.570513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.570534 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.570583 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.586896 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:34Z\\\",\\\"message\\\":\\\" column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 18:46:34.489550 6188 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nI0930 18:46:34.489547 6188 
transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:34.489593 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.604978 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.632261 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.643221 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.656259 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.667073 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.672948 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.673006 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.673020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.673038 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.673053 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.682019 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.697457 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.730588 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.746345 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.762106 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.775807 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.775850 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.775862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.775896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.775910 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.778799 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.791156 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 
30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.802691 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastS
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:48Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.878464 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.878515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.878527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.878546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.878558 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.981106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.981147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.981155 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.981171 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:48 crc kubenswrapper[4747]: I0930 18:46:48.981181 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:48Z","lastTransitionTime":"2025-09-30T18:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.084740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.084792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.084803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.084822 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.084835 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.086189 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.086210 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:49 crc kubenswrapper[4747]: E0930 18:46:49.086299 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:49 crc kubenswrapper[4747]: E0930 18:46:49.086382 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.135153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.189035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.189089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.189107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.189132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.189150 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.292211 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.292246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.292254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.292266 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.292274 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.395843 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.395977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.395997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.396026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.396050 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.493438 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/2.log" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.494858 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/1.log" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.498869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.499453 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.499480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.499504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.499521 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.500613 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc" exitCode=1 Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.500690 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.500756 4747 scope.go:117] "RemoveContainer" containerID="625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.501765 4747 scope.go:117] "RemoveContainer" containerID="e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc" Sep 30 18:46:49 crc kubenswrapper[4747]: E0930 18:46:49.502104 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.521146 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.541437 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.582168 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.602387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.602458 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.602475 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.602504 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.602523 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.605144 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de
2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.627625 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.644727 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.662146 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc 
kubenswrapper[4747]: I0930 18:46:49.680961 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.700998 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.705780 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.705846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.705864 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.705888 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.705909 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.719216 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.743405 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.774668 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625f074ac07ff44bd5444d427bed544bec77d11ca4298578a87366b59e5bdc53\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:34Z\\\",\\\"message\\\":\\\" column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 18:46:34.489550 6188 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nI0930 18:46:34.489547 6188 
transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:34.489593 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d3
76fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.794729 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.808640 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.808704 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.808721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.808747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.808767 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.815571 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.830745 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.851079 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.868164 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:49Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.912044 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.912100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.912117 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.912141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:49 crc kubenswrapper[4747]: I0930 18:46:49.912160 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:49Z","lastTransitionTime":"2025-09-30T18:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.015699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.015753 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.015773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.015797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.015815 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.087077 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:50 crc kubenswrapper[4747]: E0930 18:46:50.087605 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.087262 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:50 crc kubenswrapper[4747]: E0930 18:46:50.088134 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.119744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.119800 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.119818 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.119842 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.119860 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.223555 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.223671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.223687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.223714 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.223730 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.327453 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.327919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.328224 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.328523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.328709 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.433245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.433313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.433334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.433360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.433377 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.507837 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/2.log" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.515164 4747 scope.go:117] "RemoveContainer" containerID="e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc" Sep 30 18:46:50 crc kubenswrapper[4747]: E0930 18:46:50.515628 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.537475 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.537544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.537571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.537601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.537623 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.553988 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.574270 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.593579 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.616310 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.639994 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.640406 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.640444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.640460 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.640483 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.640503 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.662989 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.681262 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc 
kubenswrapper[4747]: I0930 18:46:50.701483 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.720245 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.741911 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.744111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.744160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.744178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.744202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.744220 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.763907 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.788486 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.820176 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.840833 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.846761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.846849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.846869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 
18:46:50.846892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.846909 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.857412 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.878423 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.893633 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:50Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.950643 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.950693 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.950709 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.950733 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:50 crc kubenswrapper[4747]: I0930 18:46:50.950751 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:50Z","lastTransitionTime":"2025-09-30T18:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.054365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.054837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.054863 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.054894 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.054917 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.086130 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.086209 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:51 crc kubenswrapper[4747]: E0930 18:46:51.086416 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:51 crc kubenswrapper[4747]: E0930 18:46:51.086581 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.104617 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.122653 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.142611 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.160247 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.160304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.160324 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.160348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.160368 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.168122 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.190777 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.210185 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.233985 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.253733 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.263671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.263747 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.263761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.263778 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.263913 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.267240 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.283158 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.292186 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.309464 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.322107 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.335998 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.354012 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.363173 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc 
kubenswrapper[4747]: I0930 18:46:51.367969 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.368039 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.368053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.368082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.368100 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.388886 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de
2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:51Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.471158 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.471201 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.471209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.471227 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.471238 4747 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.574321 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.574702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.574876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.575119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.575478 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.679123 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.679185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.679203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.679229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.679248 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.781769 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.781836 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.781857 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.781883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.781903 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.885227 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.885291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.885309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.885333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.885354 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.988583 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.988651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.988671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.988701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:51 crc kubenswrapper[4747]: I0930 18:46:51.988721 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:51Z","lastTransitionTime":"2025-09-30T18:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.086396 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.086396 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:52 crc kubenswrapper[4747]: E0930 18:46:52.086605 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:52 crc kubenswrapper[4747]: E0930 18:46:52.086705 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.092042 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.092099 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.092116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.092145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.092163 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.195674 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.195748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.195783 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.195817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.195837 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.299413 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.299473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.299493 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.299518 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.299538 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.402719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.402794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.402811 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.402837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.402854 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.506071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.506161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.506190 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.506221 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.506244 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.609126 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.609191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.609214 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.609244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.609267 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.712557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.712972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.713111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.713256 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.713456 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.817589 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.817671 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.817695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.817725 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.817745 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.921316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.921385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.921403 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.921429 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:52 crc kubenswrapper[4747]: I0930 18:46:52.921447 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:52Z","lastTransitionTime":"2025-09-30T18:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.024799 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.024886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.024910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.024984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.025013 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.087050 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.087063 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.087322 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.087379 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.128161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.128222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.128239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.128262 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.128281 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.165176 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.165336 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.165429 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs podName:5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8 nodeName:}" failed. No retries permitted until 2025-09-30 18:47:09.165403208 +0000 UTC m=+68.824883362 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs") pod "network-metrics-daemon-fbzb6" (UID: "5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.231551 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.231600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.231613 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.231631 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.231646 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.335108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.335144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.335152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.335165 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.335174 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.439409 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.439475 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.439492 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.439518 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.439539 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.549008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.549377 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.549528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.549706 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.549838 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.653078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.653180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.653202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.653224 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.653241 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.755873 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.755919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.755940 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.755955 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.755966 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.860329 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.860405 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.860430 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.860460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.860485 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.872396 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.872624 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 18:47:25.872589184 +0000 UTC m=+85.532069338 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.872827 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.873088 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.873317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.873156 4747 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.873635 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:47:25.873614188 +0000 UTC m=+85.533094332 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.873258 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.873861 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.873881 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.873952 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 18:47:25.873911067 +0000 UTC m=+85.533391211 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.874287 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.874492 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:47:25.874467255 +0000 UTC m=+85.533947409 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.873823 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.873444 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.874982 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.875132 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:53 crc kubenswrapper[4747]: E0930 18:46:53.875312 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 18:47:25.875293002 +0000 UTC m=+85.534773156 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.963165 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.963459 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.963630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.963802 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:53 crc kubenswrapper[4747]: I0930 18:46:53.964108 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:53Z","lastTransitionTime":"2025-09-30T18:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.067166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.067239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.067260 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.067289 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.067309 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.086973 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.086992 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:46:54 crc kubenswrapper[4747]: E0930 18:46:54.087191 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:46:54 crc kubenswrapper[4747]: E0930 18:46:54.087284 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.170445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.170529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.170554 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.170586 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.170610 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.273506 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.273596 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.273622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.273650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.273668 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.378118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.378197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.378222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.378254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.378275 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.481512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.481590 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.481615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.481644 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.481665 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.584135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.584209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.584228 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.584251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.584268 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.687428 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.687514 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.687539 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.687572 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.687593 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.791175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.791249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.791272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.791304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.791323 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.899029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.899457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.899623 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.899794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:54 crc kubenswrapper[4747]: I0930 18:46:54.899973 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:54Z","lastTransitionTime":"2025-09-30T18:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.004348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.004415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.004433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.004460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.004478 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.087262 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:55 crc kubenswrapper[4747]: E0930 18:46:55.087439 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.087554 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:55 crc kubenswrapper[4747]: E0930 18:46:55.087817 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.108519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.108583 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.108599 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.108622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.108640 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.212341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.212401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.212417 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.212441 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.212458 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.314866 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.315284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.315486 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.315664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.315837 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.420015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.420371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.420548 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.420742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.421048 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.524648 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.524707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.524726 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.524750 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.524768 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.616265 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.633483 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.633859 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.634385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.634803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.635044 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.635314 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.641900 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.663517 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.682069 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc 
kubenswrapper[4747]: I0930 18:46:55.704775 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557f
bfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.725519 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.738734 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.739146 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.739387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.739825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.741167 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.747882 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.771782 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.803128 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.822647 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.841300 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.844119 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.844177 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.844195 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.844776 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.844799 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.864087 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.883780 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.898836 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.919720 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.937707 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.947700 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.947764 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.947785 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.947810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.947832 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:55Z","lastTransitionTime":"2025-09-30T18:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.968435 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e
5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:55 crc kubenswrapper[4747]: I0930 18:46:55.982430 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:55Z is after 2025-08-24T17:21:41Z"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.051167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.051232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.051250 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.051276 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.051296 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.086644 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.086644 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 18:46:56 crc kubenswrapper[4747]: E0930 18:46:56.086840 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 18:46:56 crc kubenswrapper[4747]: E0930 18:46:56.087030 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.154956 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.155022 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.155040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.155065 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.155084 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.258619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.258687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.258705 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.258730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.258749 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.362254 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.362326 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.362348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.362378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.362401 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.465562 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.465630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.465651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.465677 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.465695 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.569274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.569331 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.569345 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.569373 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.569400 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.673678 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.673743 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.673762 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.673786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.673804 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.778544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.778635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.778663 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.778701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.778728 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.882716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.882789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.882804 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.882834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.882851 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.986385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.986444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.986461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.986484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:56 crc kubenswrapper[4747]: I0930 18:46:56.986503 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:56Z","lastTransitionTime":"2025-09-30T18:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.086761 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.086823 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 18:46:57 crc kubenswrapper[4747]: E0930 18:46:57.087008 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8"
Sep 30 18:46:57 crc kubenswrapper[4747]: E0930 18:46:57.087164 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.089730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.089830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.089851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.089910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.089979 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.193309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.193375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.193394 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.193419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.193437 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.296455 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.296909 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.297153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.297388 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.297794 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.401091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.401362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.401485 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.401612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.401730 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.505511 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.505593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.505611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.505637 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.505657 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.608241 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.608314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.608334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.608361 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.608380 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.711763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.711825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.711843 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.711868 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.711886 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.815368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.815433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.815451 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.815477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.815498 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.918461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.918527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.918552 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.918651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:57 crc kubenswrapper[4747]: I0930 18:46:57.918675 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:57Z","lastTransitionTime":"2025-09-30T18:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.022593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.022650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.022661 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.022686 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.022704 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.086647 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.086697 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 18:46:58 crc kubenswrapper[4747]: E0930 18:46:58.086854 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 18:46:58 crc kubenswrapper[4747]: E0930 18:46:58.087033 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.127016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.127077 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.127098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.127131 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.127165 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.230351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.230470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.230497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.230525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.230545 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.333269 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.333351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.333370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.333398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.333420 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.437102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.437160 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.437177 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.437201 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.437220 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.458385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.458446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.458470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.458495 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.458515 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:46:58 crc kubenswrapper[4747]: E0930 18:46:58.474606 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:58Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.480337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.480403 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.480428 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.480455 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.480476 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:58 crc kubenswrapper[4747]: E0930 18:46:58.500238 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:58Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.505468 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.505523 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.505548 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.505574 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.505595 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:58 crc kubenswrapper[4747]: E0930 18:46:58.524656 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:58Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.529996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.530061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.530087 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.530117 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.530141 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:58 crc kubenswrapper[4747]: E0930 18:46:58.547195 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:58Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.553283 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.553394 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.553452 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.553480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.553547 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:58 crc kubenswrapper[4747]: E0930 18:46:58.579914 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:46:58Z is after 2025-08-24T17:21:41Z" Sep 30 18:46:58 crc kubenswrapper[4747]: E0930 18:46:58.580222 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.583100 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.583358 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.583499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.583681 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.583839 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.687640 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.687714 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.687731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.687759 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.687779 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.791279 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.791342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.791366 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.791402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.791427 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.894754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.894831 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.894855 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.894889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.894913 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.998594 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.998721 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.998777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.998809 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:58 crc kubenswrapper[4747]: I0930 18:46:58.998829 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:58Z","lastTransitionTime":"2025-09-30T18:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.086621 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.086704 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:46:59 crc kubenswrapper[4747]: E0930 18:46:59.087079 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:46:59 crc kubenswrapper[4747]: E0930 18:46:59.087204 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.102489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.102547 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.102565 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.102591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.102610 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.205089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.205143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.205161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.205184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.205204 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.308195 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.308251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.308272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.308298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.308318 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.412139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.412233 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.412256 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.412286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.412314 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.516068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.516178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.516199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.516223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.516241 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.619480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.619547 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.619566 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.619624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.619643 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.723037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.723086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.723098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.723118 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.723133 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.826191 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.826315 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.826337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.826361 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.826381 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.929996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.930062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.930080 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.930104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:46:59 crc kubenswrapper[4747]: I0930 18:46:59.930121 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:46:59Z","lastTransitionTime":"2025-09-30T18:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.032889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.033028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.033052 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.033082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.033103 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.087179 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.087198 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:00 crc kubenswrapper[4747]: E0930 18:47:00.087370 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:00 crc kubenswrapper[4747]: E0930 18:47:00.087559 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.136608 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.136674 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.136691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.136715 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.136732 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.240309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.240431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.240451 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.240477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.240499 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.343670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.343740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.343763 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.343794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.343817 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.447422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.447516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.447558 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.447592 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.447615 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.551320 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.551385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.551408 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.551435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.551456 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.654501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.654564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.654586 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.654615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.654634 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.758611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.758657 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.758670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.758725 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.758738 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.862141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.862227 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.862246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.862271 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.862306 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.966691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.966774 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.966797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.966827 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:00 crc kubenswrapper[4747]: I0930 18:47:00.966847 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:00Z","lastTransitionTime":"2025-09-30T18:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.071238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.071293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.071305 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.071327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.071342 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.086540 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:01 crc kubenswrapper[4747]: E0930 18:47:01.086744 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.087621 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:01 crc kubenswrapper[4747]: E0930 18:47:01.087837 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.113497 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.133564 4747 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.152887 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.177390 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.177795 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.177842 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.177860 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.177885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.177905 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.210029 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.229835 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.243663 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.255818 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.267912 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.279767 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.280956 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.280998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.281010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.281029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.281042 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.296941 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.314091 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf02103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.347181 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.362464 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.378866 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.383128 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.383178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.383197 4747 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.383219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.383236 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.396125 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.409911 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc 
kubenswrapper[4747]: I0930 18:47:01.424368 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:01Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.492182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.492221 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.492232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.492251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.492264 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.596302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.596390 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.596414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.596443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.596463 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.700063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.700306 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.700315 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.700328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.700339 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.803286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.803334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.803350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.803370 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.803387 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.907188 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.907243 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.907258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.907277 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:01 crc kubenswrapper[4747]: I0930 18:47:01.907293 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:01Z","lastTransitionTime":"2025-09-30T18:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.010652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.010711 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.010731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.010754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.010771 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.087280 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.087323 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:02 crc kubenswrapper[4747]: E0930 18:47:02.087914 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:02 crc kubenswrapper[4747]: E0930 18:47:02.088158 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.113730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.113799 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.113818 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.113845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.113866 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.217832 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.218322 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.218542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.218742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.218897 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.321819 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.321917 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.321968 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.322001 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.322025 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.425150 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.425220 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.425239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.425267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.425287 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.530049 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.530503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.530781 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.531558 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.531865 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.635620 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.636057 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.636215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.636364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.636545 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.739557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.742494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.743225 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.743298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.743330 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.846138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.846233 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.846251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.846276 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.846294 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.948859 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.948915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.948962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.948985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:02 crc kubenswrapper[4747]: I0930 18:47:02.949003 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:02Z","lastTransitionTime":"2025-09-30T18:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.051626 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.051684 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.051702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.051727 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.051746 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.086593 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:03 crc kubenswrapper[4747]: E0930 18:47:03.086776 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.086885 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:03 crc kubenswrapper[4747]: E0930 18:47:03.087200 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.155436 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.155497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.155515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.155544 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.155571 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.259246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.259666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.259816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.260029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.260184 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.363169 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.363238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.363251 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.363271 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.363284 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.465572 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.465611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.465652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.465670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.465682 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.568269 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.568325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.568342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.568368 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.568385 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.671557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.671630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.671655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.671684 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.671705 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.774768 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.774834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.774846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.774885 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.774899 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.878166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.878223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.878235 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.878257 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.878271 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.981238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.981301 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.981313 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.981329 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:03 crc kubenswrapper[4747]: I0930 18:47:03.981368 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:03Z","lastTransitionTime":"2025-09-30T18:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.083845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.083880 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.083891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.083907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.083935 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.086272 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.086272 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:04 crc kubenswrapper[4747]: E0930 18:47:04.086379 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:04 crc kubenswrapper[4747]: E0930 18:47:04.086446 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.087615 4747 scope.go:117] "RemoveContainer" containerID="e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc" Sep 30 18:47:04 crc kubenswrapper[4747]: E0930 18:47:04.088113 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.187148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.187179 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.187212 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.187229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.187240 4747 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.290389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.290437 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.290447 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.290469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.290481 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.393095 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.393138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.393149 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.393167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.393179 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.495470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.495538 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.495563 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.495592 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.495614 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.597903 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.598005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.598032 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.598063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.598087 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.700795 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.700858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.700878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.700903 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.700922 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.803445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.803510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.803539 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.803571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.803595 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.906861 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.906970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.906985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.907002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:04 crc kubenswrapper[4747]: I0930 18:47:04.907013 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:04Z","lastTransitionTime":"2025-09-30T18:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.010949 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.010983 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.010993 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.011008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.011018 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.087177 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.087253 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:05 crc kubenswrapper[4747]: E0930 18:47:05.087329 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:05 crc kubenswrapper[4747]: E0930 18:47:05.087460 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.114327 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.114367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.114378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.114394 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.114407 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.216917 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.216972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.216980 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.216996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.217005 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.319353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.319399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.319411 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.319427 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.319441 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.422286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.422343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.422360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.422385 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.422405 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.525456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.525499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.525512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.525532 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.525545 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.628079 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.628135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.628148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.628166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.628180 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.731739 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.731787 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.731841 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.731858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.731873 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.834813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.834887 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.834907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.834973 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.834993 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.938540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.938609 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.938638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.938664 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:05 crc kubenswrapper[4747]: I0930 18:47:05.938682 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:05Z","lastTransitionTime":"2025-09-30T18:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.042008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.042092 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.042114 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.042140 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.042157 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.086363 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.086440 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:06 crc kubenswrapper[4747]: E0930 18:47:06.086500 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:06 crc kubenswrapper[4747]: E0930 18:47:06.086606 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.143995 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.144044 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.144061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.144083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.144100 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.246541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.246586 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.246602 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.246625 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.246642 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.349087 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.349124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.349136 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.349338 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.349351 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.452258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.452553 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.452705 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.452815 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.452940 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.556104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.556166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.556180 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.556202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.556224 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.659064 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.659466 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.659651 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.659821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.660457 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.763449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.763497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.763507 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.763525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.763537 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.866402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.866446 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.866457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.866473 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.866486 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.969135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.969198 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.969215 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.969238 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:06 crc kubenswrapper[4747]: I0930 18:47:06.969256 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:06Z","lastTransitionTime":"2025-09-30T18:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.071741 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.071794 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.071806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.071825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.071837 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.087121 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:07 crc kubenswrapper[4747]: E0930 18:47:07.087251 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.087293 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:07 crc kubenswrapper[4747]: E0930 18:47:07.087483 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.175012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.175063 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.175083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.175106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.175123 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.278098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.278148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.278164 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.278185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.278203 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.381340 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.381551 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.381728 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.381903 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.382101 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.484777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.484818 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.484829 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.484845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.484858 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.586806 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.587292 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.587355 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.587421 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.587500 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.690002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.690106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.690185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.690285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.690378 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.793121 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.793367 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.793514 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.793672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.793804 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.897116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.897161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.897172 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.897190 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:07 crc kubenswrapper[4747]: I0930 18:47:07.897203 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:07Z","lastTransitionTime":"2025-09-30T18:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.000052 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.000125 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.000151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.000190 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.000216 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.086123 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:08 crc kubenswrapper[4747]: E0930 18:47:08.086252 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.086484 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:08 crc kubenswrapper[4747]: E0930 18:47:08.086910 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.103055 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.103086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.103102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.103123 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.103139 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.206051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.206101 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.206113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.206128 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.206140 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.308614 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.308645 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.308655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.308668 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.308679 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.411208 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.411255 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.411265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.411282 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.411302 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.513983 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.514424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.514546 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.514692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.514839 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.617627 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.617666 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.617676 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.617691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.617703 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.641298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.641318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.641325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.641335 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.641342 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: E0930 18:47:08.658333 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:08Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.662107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.662130 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.662138 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.662151 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.662162 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: E0930 18:47:08.684738 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:08Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.689779 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.689832 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.689850 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.689874 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.689892 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: E0930 18:47:08.708916 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:08Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.713682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.713709 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.713720 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.713734 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.713744 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: E0930 18:47:08.743383 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:08Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.763910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.764001 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.764019 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.764047 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.764068 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: E0930 18:47:08.783625 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:08Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:08 crc kubenswrapper[4747]: E0930 18:47:08.783786 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.785568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.785600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.785611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.785627 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.785638 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.888357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.888393 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.888401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.888416 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.888427 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.990816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.990858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.990870 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.990886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:08 crc kubenswrapper[4747]: I0930 18:47:08.990896 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:08Z","lastTransitionTime":"2025-09-30T18:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.086518 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.086616 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:09 crc kubenswrapper[4747]: E0930 18:47:09.086701 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:09 crc kubenswrapper[4747]: E0930 18:47:09.086881 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.093373 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.093420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.093436 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.093457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.093472 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.196328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.196360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.196371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.196389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.196401 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.250051 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:09 crc kubenswrapper[4747]: E0930 18:47:09.250278 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:47:09 crc kubenswrapper[4747]: E0930 18:47:09.250375 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs podName:5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8 nodeName:}" failed. No retries permitted until 2025-09-30 18:47:41.250349709 +0000 UTC m=+100.909829853 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs") pod "network-metrics-daemon-fbzb6" (UID: "5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.299132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.299174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.299185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.299202 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.299217 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.401337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.401398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.401415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.401440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.401461 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.504561 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.504622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.504640 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.504669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.504687 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.607845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.607901 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.607920 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.607974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.607992 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.710537 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.710600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.710618 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.710644 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.710661 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.813817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.813858 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.813869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.813884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.813894 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.916230 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.916285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.916303 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.916326 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:09 crc kubenswrapper[4747]: I0930 18:47:09.916344 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:09Z","lastTransitionTime":"2025-09-30T18:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.018695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.018750 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.018768 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.018791 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.018808 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.087025 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.087112 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:10 crc kubenswrapper[4747]: E0930 18:47:10.087270 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:10 crc kubenswrapper[4747]: E0930 18:47:10.087375 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.122043 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.122084 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.122093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.122107 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.122118 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.225325 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.225398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.225418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.225444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.225462 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.328606 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.328672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.328690 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.328715 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.328733 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.431514 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.431574 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.431591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.431617 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.431635 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.535600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.535655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.535667 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.535687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.535703 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.592467 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/0.log" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.592565 4747 generic.go:334] "Generic (PLEG): container finished" podID="34f8698b-7682-4b27-99d0-d72fff30d5a8" containerID="0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c" exitCode=1 Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.592614 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zjq4" event={"ID":"34f8698b-7682-4b27-99d0-d72fff30d5a8","Type":"ContainerDied","Data":"0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.593294 4747 scope.go:117] "RemoveContainer" containerID="0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.626067 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.642780 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.642833 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.642849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.642876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.642896 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.643232 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.660034 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.674252 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.689591 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.711242 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.732726 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.745135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.745185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.745194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.745209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.745220 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.747105 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc 
kubenswrapper[4747]: I0930 18:47:10.761868 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.778116 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.791436 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.807542 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.824322 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.845898 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.847228 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.847250 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.847258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.847272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.847282 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.859859 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.868414 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.882381 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:09Z\\\",\\\"message\\\":\\\"2025-09-30T18:46:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611\\\\n2025-09-30T18:46:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611 to /host/opt/cni/bin/\\\\n2025-09-30T18:46:24Z [verbose] multus-daemon started\\\\n2025-09-30T18:46:24Z [verbose] Readiness Indicator file check\\\\n2025-09-30T18:47:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.893078 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:10Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.949572 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.949612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.949621 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.949633 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:10 crc kubenswrapper[4747]: I0930 18:47:10.949645 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:10Z","lastTransitionTime":"2025-09-30T18:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.051994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.052033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.052048 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.052064 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.052077 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.086190 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:11 crc kubenswrapper[4747]: E0930 18:47:11.086315 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.086412 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:11 crc kubenswrapper[4747]: E0930 18:47:11.086611 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.102648 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.119878 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.131632 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.151757 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.154169 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.154189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.154197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.154239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.154250 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.162633 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc 
kubenswrapper[4747]: I0930 18:47:11.173399 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.186018 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.198856 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.208627 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.222716 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.246103 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.256854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.256883 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.256895 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.256912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.256950 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.264880 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.274795 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.285010 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:09Z\\\",\\\"message\\\":\\\"2025-09-30T18:46:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611\\\\n2025-09-30T18:46:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611 to /host/opt/cni/bin/\\\\n2025-09-30T18:46:24Z [verbose] multus-daemon started\\\\n2025-09-30T18:46:24Z [verbose] Readiness Indicator file check\\\\n2025-09-30T18:47:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.293522 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.313377 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.325640 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.335915 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf02103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.358984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.359026 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.359037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.359054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.359065 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.461174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.461267 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.461293 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.461383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.461484 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.564797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.564872 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.564914 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.564977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.564995 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.598818 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/0.log" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.599185 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zjq4" event={"ID":"34f8698b-7682-4b27-99d0-d72fff30d5a8","Type":"ContainerStarted","Data":"f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.619197 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.632533 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.648724 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.659895 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc 
kubenswrapper[4747]: I0930 18:47:11.667775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.667838 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.667852 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.667869 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.667882 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.670630 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5
512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.685969 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.696901 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.708032 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.724176 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.744994 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.758279 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.770505 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.770781 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.770801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.770814 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.770832 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.770844 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.783228 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.795008 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.804744 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.821079 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.832688 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.849689 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:09Z\\\",\\\"message\\\":\\\"2025-09-30T18:46:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611\\\\n2025-09-30T18:46:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611 to /host/opt/cni/bin/\\\\n2025-09-30T18:46:24Z [verbose] multus-daemon started\\\\n2025-09-30T18:46:24Z [verbose] Readiness Indicator file check\\\\n2025-09-30T18:47:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:47:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:11Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.875698 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.875786 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.875810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 
18:47:11.875840 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.875873 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.978672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.978732 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.978749 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.978771 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:11 crc kubenswrapper[4747]: I0930 18:47:11.978784 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:11Z","lastTransitionTime":"2025-09-30T18:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.082342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.082435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.082455 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.082484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.082502 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.086822 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.086856 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:12 crc kubenswrapper[4747]: E0930 18:47:12.087055 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:12 crc kubenswrapper[4747]: E0930 18:47:12.087176 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.184836 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.184899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.184956 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.184994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.185019 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.287775 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.287832 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.287849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.287874 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.287895 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.390966 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.391033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.391078 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.391105 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.391123 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.493744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.493816 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.493835 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.493861 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.493879 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.596659 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.596720 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.596744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.596774 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.596795 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.699308 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.699372 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.699393 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.699419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.699438 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.801222 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.801272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.801282 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.801300 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.801311 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.904360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.904420 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.904438 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.904461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:12 crc kubenswrapper[4747]: I0930 18:47:12.904479 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:12Z","lastTransitionTime":"2025-09-30T18:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.008192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.008337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.008363 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.008395 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.008418 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.087259 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.087343 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:13 crc kubenswrapper[4747]: E0930 18:47:13.087499 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:13 crc kubenswrapper[4747]: E0930 18:47:13.087632 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.110977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.111035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.111058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.111083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.111101 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.213894 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.213987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.214006 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.214030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.214050 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.317011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.317073 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.317091 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.317116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.317136 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.420638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.420713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.420731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.420754 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.420770 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.523900 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.523996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.524020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.524054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.524075 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.625998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.626029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.626037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.626050 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.626060 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.729541 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.729612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.729632 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.729656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.729677 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.832167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.832275 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.832297 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.832321 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.832342 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.935639 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.935707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.935728 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.935814 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:13 crc kubenswrapper[4747]: I0930 18:47:13.935831 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:13Z","lastTransitionTime":"2025-09-30T18:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.039707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.039790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.039810 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.039837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.039856 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.086335 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:14 crc kubenswrapper[4747]: E0930 18:47:14.086539 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.086677 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:14 crc kubenswrapper[4747]: E0930 18:47:14.087257 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.102990 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.143343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.143427 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.143440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.143463 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.143475 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.247196 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.247277 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.247299 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.247333 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.247356 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.350357 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.350412 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.350423 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.350445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.350458 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.454182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.454261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.454286 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.454316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.454339 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.557575 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.557635 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.557655 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.557680 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.557700 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.661253 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.661318 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.661337 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.661364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.661382 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.764477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.764512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.764519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.764536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.764545 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.868082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.868135 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.868153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.868178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.868196 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.972069 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.972124 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.972141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.972164 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:14 crc kubenswrapper[4747]: I0930 18:47:14.972182 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:14Z","lastTransitionTime":"2025-09-30T18:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.076314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.076488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.076510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.076536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.076554 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.086814 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:15 crc kubenswrapper[4747]: E0930 18:47:15.087046 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.088445 4747 scope.go:117] "RemoveContainer" containerID="e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.089243 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:15 crc kubenswrapper[4747]: E0930 18:47:15.089432 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.180220 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.180285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.180308 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.180336 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.180353 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.283638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.283699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.283714 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.283735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.283748 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.386682 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.386741 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.386751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.386771 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.386783 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.489855 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.489991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.490030 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.490064 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.490085 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.593598 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.593692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.593716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.593749 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.593778 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.615236 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/2.log" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.618490 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.620100 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.642120 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.663671 4747 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.681455 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.697209 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.697265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.697283 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.697311 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.697330 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.698430 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.733073 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.759762 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.773106 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"973afb09-20b0-46c6-bea9-e822e07c64f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c682b33e155a43e48d8173b084a93df1a6badd45c3c1fc9dbeb8daa9959952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.795446 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.799777 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.799819 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc 
kubenswrapper[4747]: I0930 18:47:15.799833 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.799854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.799868 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.808425 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.821653 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:09Z\\\",\\\"message\\\":\\\"2025-09-30T18:46:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611\\\\n2025-09-30T18:46:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611 to /host/opt/cni/bin/\\\\n2025-09-30T18:46:24Z [verbose] multus-daemon started\\\\n2025-09-30T18:46:24Z [verbose] Readiness Indicator file check\\\\n2025-09-30T18:47:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:47:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.829302 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.845852 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18
:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.858049 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.877641 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.893893 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.903179 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.903236 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.903256 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.903281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.903299 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:15Z","lastTransitionTime":"2025-09-30T18:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.911254 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de
2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.931278 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.948732 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:15 crc kubenswrapper[4747]: I0930 18:47:15.962909 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:15Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc 
kubenswrapper[4747]: I0930 18:47:16.006225 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.006282 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.006297 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.006319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.006346 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.086572 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.086649 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:16 crc kubenswrapper[4747]: E0930 18:47:16.086798 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:16 crc kubenswrapper[4747]: E0930 18:47:16.086868 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.109095 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.109157 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.109177 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.109203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.109221 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.211658 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.211724 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.211744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.211771 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.211791 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.314997 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.315054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.315072 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.315098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.315114 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.418089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.418173 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.418190 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.418216 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.418234 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.521781 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.521862 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.521887 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.521918 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.521988 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.624989 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.625027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.625037 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.625051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.625065 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.625413 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/3.log" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.626522 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/2.log" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.630827 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5" exitCode=1 Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.630873 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.631089 4747 scope.go:117] "RemoveContainer" containerID="e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.635048 4747 scope.go:117] "RemoveContainer" containerID="4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5" Sep 30 18:47:16 crc kubenswrapper[4747]: E0930 18:47:16.635524 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.660136 4747 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.681834 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.700994 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.720213 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.728449 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.728501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.728519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.728542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.728558 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.746315 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.777790 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5903d00e6f0fffa12f72f2407b6cb3eeb1ed021b83ad35f6531b8ee1abc8dbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:46:49Z\\\",\\\"message\\\":\\\"UUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 
options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 18:46:49.006468 6403 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:16Z\\\",\\\"message\\\":\\\"formers/factory.go:160\\\\nI0930 18:47:16.172150 6760 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.172392 6760 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.172555 6760 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.173014 6760 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:47:16.173050 6760 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:47:16.173081 6760 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:47:16.173167 6760 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:47:16.173185 6760 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:47:16.173108 6760 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:47:16.173210 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18:47:16.173217 6760 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:47:16.173240 6760 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 18:47:16.173288 6760 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:47:16.173313 6760 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.794652 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"973afb09-20b0-46c6-bea9-e822e07c64f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c682b33e155a43e48d8173b084a93df1a6badd45c3c1fc9dbeb8daa9959952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.814569 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.828063 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.831384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.831442 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.831460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.831484 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.831501 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.844220 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:09Z\\\",\\\"message\\\":\\\"2025-09-30T18:46:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611\\\\n2025-09-30T18:46:24+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611 to /host/opt/cni/bin/\\\\n2025-09-30T18:46:24Z [verbose] multus-daemon started\\\\n2025-09-30T18:46:24Z [verbose] Readiness Indicator file check\\\\n2025-09-30T18:47:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:47:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.857004 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.885336 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18
:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.904651 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.920906 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.934264 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.934323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.934336 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.934352 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.934364 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:16Z","lastTransitionTime":"2025-09-30T18:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.938826 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.955254 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.970204 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.982999 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:16 crc kubenswrapper[4747]: I0930 18:47:16.995233 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:16Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc 
kubenswrapper[4747]: I0930 18:47:17.037058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.037128 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.037152 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.037182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.037203 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.086261 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.086343 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:17 crc kubenswrapper[4747]: E0930 18:47:17.086462 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:17 crc kubenswrapper[4747]: E0930 18:47:17.086668 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.140430 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.140515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.140540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.140634 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.140655 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.243641 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.243702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.243719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.243743 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.243760 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.347002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.347049 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.347062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.347079 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.347091 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.449912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.450005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.450021 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.450047 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.450066 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.552363 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.552424 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.552443 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.552469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.552489 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.636508 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/3.log" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.641875 4747 scope.go:117] "RemoveContainer" containerID="4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5" Sep 30 18:47:17 crc kubenswrapper[4747]: E0930 18:47:17.642232 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.654955 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.655039 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.655058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.655082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.655098 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.663104 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c
42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.684804 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.696772 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.711499 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.737225 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:16Z\\\",\\\"message\\\":\\\"formers/factory.go:160\\\\nI0930 18:47:16.172150 6760 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.172392 6760 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.172555 6760 reflector.go:311] Stopping 
reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.173014 6760 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:47:16.173050 6760 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:47:16.173081 6760 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:47:16.173167 6760 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:47:16.173185 6760 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:47:16.173108 6760 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:47:16.173210 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18:47:16.173217 6760 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:47:16.173240 6760 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 18:47:16.173288 6760 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:47:16.173313 6760 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:47:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.753030 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.757878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.757948 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.757970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.758001 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.758025 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.770297 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.783657 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.804185 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:09Z\\\",\\\"message\\\":\\\"2025-09-30T18:46:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611\\\\n2025-09-30T18:46:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611 to /host/opt/cni/bin/\\\\n2025-09-30T18:46:24Z [verbose] multus-daemon started\\\\n2025-09-30T18:46:24Z [verbose] Readiness Indicator file check\\\\n2025-09-30T18:47:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:47:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.818109 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.828997 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"973afb09-20b0-46c6-bea9-e822e07c64f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c682b33e155a43e48d8173b084a93df1a6badd45c3c1fc9dbeb8daa9959952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc 
kubenswrapper[4747]: I0930 18:47:17.844425 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.857670 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.861527 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.861568 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.861585 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.861609 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.861623 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.881872 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e
5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.901738 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.919840 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.938029 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.956583 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:17 crc 
kubenswrapper[4747]: I0930 18:47:17.967407 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.967496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.967516 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.967545 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.967572 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:17Z","lastTransitionTime":"2025-09-30T18:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:17 crc kubenswrapper[4747]: I0930 18:47:17.978974 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5
512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:17Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.071141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.071229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.071248 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.071287 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.071305 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.086195 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.086311 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:18 crc kubenswrapper[4747]: E0930 18:47:18.086693 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:18 crc kubenswrapper[4747]: E0930 18:47:18.086962 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.174386 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.174496 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.174520 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.174549 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.174570 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.277691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.277770 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.277792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.277824 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.277845 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.380242 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.380302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.380319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.380342 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.380359 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.483509 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.483564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.483582 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.483604 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.483621 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.586581 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.586652 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.586669 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.586693 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.586711 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.690025 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.690098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.690116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.690142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.690160 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.793184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.793289 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.793363 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.793432 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.793500 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.896618 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.896680 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.896696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.896718 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.896737 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.987856 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.987987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.988016 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.988048 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:18 crc kubenswrapper[4747]: I0930 18:47:18.988078 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:18Z","lastTransitionTime":"2025-09-30T18:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: E0930 18:47:19.001471 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:18Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.007169 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.007210 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.007219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.007236 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.007247 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: E0930 18:47:19.024239 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:19Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.027646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.027684 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.027695 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.027712 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.027724 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: E0930 18:47:19.044683 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:19Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.049992 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.050053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.050065 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.050085 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.050103 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: E0930 18:47:19.070012 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:19Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.074845 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.074893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.074908 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.074943 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.074956 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.086334 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.086338 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:19 crc kubenswrapper[4747]: E0930 18:47:19.086509 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:19 crc kubenswrapper[4747]: E0930 18:47:19.086687 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:19 crc kubenswrapper[4747]: E0930 18:47:19.092423 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:19Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:19 crc kubenswrapper[4747]: E0930 18:47:19.092537 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.094663 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.094692 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.094701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.094714 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.094725 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.197591 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.197686 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.197716 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.197746 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.197767 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.300105 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.300162 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.300203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.300223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.300237 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.403323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.403419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.403444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.403529 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.403556 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.507281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.507365 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.507391 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.507422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.507446 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.610398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.610463 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.610480 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.610503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.610520 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.713755 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.713834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.713856 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.713886 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.713910 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.817236 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.817320 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.817348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.817383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.817406 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.921009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.921085 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.921102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.921129 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:19 crc kubenswrapper[4747]: I0930 18:47:19.921150 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:19Z","lastTransitionTime":"2025-09-30T18:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.024301 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.024373 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.024392 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.024417 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.024436 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.086293 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.086373 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:20 crc kubenswrapper[4747]: E0930 18:47:20.086507 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:20 crc kubenswrapper[4747]: E0930 18:47:20.086772 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.127961 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.128029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.128064 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.128104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.128131 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.230512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.230574 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.230598 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.230628 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.230650 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.333773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.333823 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.333849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.333871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.333889 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.436170 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.436241 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.436265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.436295 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.436319 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.539356 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.539404 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.539422 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.539444 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.539457 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.642998 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.643061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.643084 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.643113 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.643134 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.747000 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.747115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.747141 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.747174 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.747200 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.851376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.851479 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.851509 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.851542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.851611 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.955328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.955412 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.955434 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.955462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:20 crc kubenswrapper[4747]: I0930 18:47:20.955487 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:20Z","lastTransitionTime":"2025-09-30T18:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.059065 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.059144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.059168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.059199 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.059219 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.086971 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.087076 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:21 crc kubenswrapper[4747]: E0930 18:47:21.087266 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:21 crc kubenswrapper[4747]: E0930 18:47:21.087519 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.109983 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.132445 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.156001 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:09Z\\\",\\\"message\\\":\\\"2025-09-30T18:46:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611\\\\n2025-09-30T18:46:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611 to /host/opt/cni/bin/\\\\n2025-09-30T18:46:24Z [verbose] multus-daemon started\\\\n2025-09-30T18:46:24Z [verbose] Readiness Indicator file check\\\\n2025-09-30T18:47:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:47:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.162292 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.162341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.162364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 
18:47:21.162393 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.162416 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.175597 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.194917 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"973afb09-20b0-46c6-bea9-e822e07c64f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c682b33e155a43e48d8173b084a93df1a6badd45c3c1fc9dbeb8daa9959952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.214669 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.235338 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.260516 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.266025 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.266102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.266121 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.266154 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.266175 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.279539 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de
2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.295346 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.316475 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.335425 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc 
kubenswrapper[4747]: I0930 18:47:21.358170 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.370667 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.370738 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.370758 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.370789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.370807 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.380507 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.400196 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.418127 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.440831 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.475120 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.475187 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.475213 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.475284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.475309 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.477023 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:16Z\\\",\\\"message\\\":\\\"formers/factory.go:160\\\\nI0930 18:47:16.172150 6760 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.172392 6760 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.172555 6760 reflector.go:311] Stopping 
reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.173014 6760 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:47:16.173050 6760 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:47:16.173081 6760 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:47:16.173167 6760 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:47:16.173185 6760 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:47:16.173108 6760 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:47:16.173210 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18:47:16.173217 6760 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:47:16.173240 6760 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 18:47:16.173288 6760 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:47:16.173313 6760 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:47:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.500828 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:21Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.578284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.578353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.578397 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.578428 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.578450 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.681268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.681340 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.681364 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.681392 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.681412 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.784550 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.785006 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.785084 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.785167 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.785250 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.889713 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.889790 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.889805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.889842 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.889857 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.994189 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.994265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.994285 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.994311 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:21 crc kubenswrapper[4747]: I0930 18:47:21.994329 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:21Z","lastTransitionTime":"2025-09-30T18:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.086210 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.086352 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:22 crc kubenswrapper[4747]: E0930 18:47:22.087469 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:22 crc kubenswrapper[4747]: E0930 18:47:22.087669 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.097893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.097986 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.098004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.098029 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.098046 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.201102 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.201179 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.201197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.201223 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.201241 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.304278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.304344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.304362 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.304387 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.304406 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.407864 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.407965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.407986 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.408011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.408029 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.511020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.511284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.511542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.511974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.512420 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.616278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.616348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.616378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.616406 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.616424 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.719751 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.719849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.719867 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.719891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.719911 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.823435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.823534 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.823552 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.823576 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.823594 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.926872 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.927004 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.927027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.927054 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:22 crc kubenswrapper[4747]: I0930 18:47:22.927071 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:22Z","lastTransitionTime":"2025-09-30T18:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.030012 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.030572 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.030736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.030871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.031059 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.087218 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:23 crc kubenswrapper[4747]: E0930 18:47:23.087777 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.088208 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:23 crc kubenswrapper[4747]: E0930 18:47:23.088365 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.133687 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.133745 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.133773 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.133801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.133820 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.237276 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.237351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.237369 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.237398 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.237417 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.341027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.341106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.341132 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.341166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.341192 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.444996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.445086 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.445112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.445139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.445158 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.550133 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.550547 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.550691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.550831 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.551146 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.655217 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.655296 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.655349 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.655381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.655405 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.758744 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.759225 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.759442 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.759653 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.759854 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.863284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.863355 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.863376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.863405 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.863423 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.966419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.966880 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.967168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.967399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:23 crc kubenswrapper[4747]: I0930 18:47:23.967601 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:23Z","lastTransitionTime":"2025-09-30T18:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.071502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.071854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.072083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.072295 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.072479 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.087034 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.087070 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:24 crc kubenswrapper[4747]: E0930 18:47:24.087380 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:24 crc kubenswrapper[4747]: E0930 18:47:24.087407 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.176727 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.176792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.176808 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.176834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.176855 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.280383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.280466 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.280485 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.280510 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.280530 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.384185 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.384245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.384266 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.384292 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.384309 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.487536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.487679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.487719 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.487752 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.487776 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.590892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.590994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.591008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.591028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.591040 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.693825 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.693983 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.694008 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.694051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.694071 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.797343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.797395 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.797418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.797441 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.797455 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.900309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.900358 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.900373 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.900393 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:24 crc kubenswrapper[4747]: I0930 18:47:24.900406 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:24Z","lastTransitionTime":"2025-09-30T18:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.003891 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.003975 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.003995 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.004020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.004041 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.086507 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.086562 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.086731 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.086862 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.107169 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.107244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.107264 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.107294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.107316 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.210856 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.210985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.211011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.211036 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.211054 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.314697 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.314764 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.314785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.314812 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.314832 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.418259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.418347 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.418433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.418502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.418528 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.522005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.522088 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.522108 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.522144 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.522169 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.625622 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.625691 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.625710 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.625742 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.625761 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.729079 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.729148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.729165 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.729192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.729211 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.832556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.832605 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.832619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.832636 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.832648 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.936487 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.936565 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.936589 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.936624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.936654 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:25Z","lastTransitionTime":"2025-09-30T18:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.957766 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.958526 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 18:48:29.958482491 +0000 UTC m=+149.617962655 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.958661 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.958743 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.958820 4747 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.958842 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:25 crc kubenswrapper[4747]: I0930 18:47:25.958903 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.958972 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.958900349 +0000 UTC m=+149.618380503 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959014 4747 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959083 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959114 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959120 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.959091692 +0000 UTC m=+149.618571886 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959133 4747 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959132 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959170 4747 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959187 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-09-30 18:48:29.959169124 +0000 UTC m=+149.618649358 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959193 4747 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:47:25 crc kubenswrapper[4747]: E0930 18:47:25.959253 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.959235965 +0000 UTC m=+149.618716119 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.039465 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.039557 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.039575 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.039600 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.039618 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.086552 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.086567 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:26 crc kubenswrapper[4747]: E0930 18:47:26.086833 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:26 crc kubenswrapper[4747]: E0930 18:47:26.086714 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.144524 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.144611 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.144642 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.144679 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.144718 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.248321 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.248402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.248430 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.248458 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.248477 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.352197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.352265 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.352283 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.352307 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.352327 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.455684 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.455731 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.455748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.455772 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.455789 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.559097 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.559143 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.559161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.559183 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.559199 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.663065 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.663122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.663139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.663162 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.663179 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.766554 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.767010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.767033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.767057 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.767077 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.870162 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.870220 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.870240 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.870263 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.870281 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.973234 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.973288 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.973308 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.973334 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:26 crc kubenswrapper[4747]: I0930 18:47:26.973351 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:26Z","lastTransitionTime":"2025-09-30T18:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.076204 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.076272 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.076290 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.076316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.076336 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.087181 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.087415 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:27 crc kubenswrapper[4747]: E0930 18:47:27.087608 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:27 crc kubenswrapper[4747]: E0930 18:47:27.087847 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.179237 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.179311 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.179329 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.179359 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.179386 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.282850 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.282912 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.282963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.282988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.283034 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.386168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.386244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.386268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.386294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.386312 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.489698 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.489762 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.489797 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.489826 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.489851 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.592360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.592433 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.592460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.592488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.592512 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.695803 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.695882 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.695903 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.695967 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.695988 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.799093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.799145 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.799159 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.799178 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.799191 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.901596 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.901670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.901696 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.901725 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:27 crc kubenswrapper[4747]: I0930 18:47:27.901746 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:27Z","lastTransitionTime":"2025-09-30T18:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.004849 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.004906 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.004947 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.004970 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.004989 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.086450 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.086526 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:28 crc kubenswrapper[4747]: E0930 18:47:28.086642 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:28 crc kubenswrapper[4747]: E0930 18:47:28.086748 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.107163 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.107226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.107245 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.107268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.107286 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.210153 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.210214 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.210231 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.210258 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.210276 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.313558 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.313610 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.313619 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.313637 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.313647 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.416760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.416833 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.416868 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.416901 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.416961 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.520646 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.520707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.520730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.520760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.520790 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.623770 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.623830 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.623867 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.623901 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.623967 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.726346 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.726431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.726454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.726489 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.726514 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.829380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.829445 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.829464 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.829488 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.829648 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.933210 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.933261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.933278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.933302 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:28 crc kubenswrapper[4747]: I0930 18:47:28.933319 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:28Z","lastTransitionTime":"2025-09-30T18:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.036701 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.036772 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.036789 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.036817 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.036836 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.086444 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.086484 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:29 crc kubenswrapper[4747]: E0930 18:47:29.086644 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:29 crc kubenswrapper[4747]: E0930 18:47:29.086789 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.140343 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.140415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.140431 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.140457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.140474 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.183477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.183535 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.183556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.183579 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.183597 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: E0930 18:47:29.204028 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.209540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.209601 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.209626 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.209656 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.209680 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: E0930 18:47:29.231804 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.237442 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.237486 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.237502 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.237525 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.237544 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: E0930 18:47:29.258294 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.263437 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.263499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.263518 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.263543 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.263563 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: E0930 18:47:29.285169 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.289344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.289412 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.289428 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.289454 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.289473 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: E0930 18:47:29.304660 4747 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"37988aed-caa1-4cf6-8704-8dc8a1aec71e\\\",\\\"systemUUID\\\":\\\"654e05b7-6acc-4d21-b8da-ee5f38eb9a9f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:29Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:29 crc kubenswrapper[4747]: E0930 18:47:29.304812 4747 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.306448 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.306501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.306556 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.306579 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.306595 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.409872 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.409963 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.409987 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.410017 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.410039 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.512638 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.512714 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.512735 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.512760 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.512779 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.616049 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.616104 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.616123 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.616148 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.616167 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.719400 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.719469 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.719487 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.719515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.719533 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.823401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.823472 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.823492 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.823518 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.823537 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.927274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.927355 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.927380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.927414 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:29 crc kubenswrapper[4747]: I0930 18:47:29.927440 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:29Z","lastTransitionTime":"2025-09-30T18:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.030976 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.031058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.031083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.031112 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.031129 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.086911 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.086911 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:30 crc kubenswrapper[4747]: E0930 18:47:30.087152 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:30 crc kubenswrapper[4747]: E0930 18:47:30.087270 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.134624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.134685 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.134707 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.134736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.134764 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.237911 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.238028 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.238068 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.238098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.238122 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.340953 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.341011 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.341027 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.341116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.341135 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.443588 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.443632 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.443648 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.443670 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.443691 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.547134 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.547200 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.547219 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.547244 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.547262 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.650697 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.650766 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.650787 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.651040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.651061 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.761897 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.761988 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.762007 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.762031 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.762049 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.865878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.865994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.866020 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.866051 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.866075 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.969785 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.969852 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.969871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.969898 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:30 crc kubenswrapper[4747]: I0930 18:47:30.969918 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:30Z","lastTransitionTime":"2025-09-30T18:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.073166 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.073232 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.073249 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.073274 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.073292 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.087282 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.087393 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:31 crc kubenswrapper[4747]: E0930 18:47:31.087557 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:31 crc kubenswrapper[4747]: E0930 18:47:31.087793 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.120295 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5f7e2f2-8825-4742-9343-b9957b189d23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cb12a8e3bf3a95ae5d983fcd76debe99dd377113331c884dd90043962371fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8
b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b059e6ba333b139ba19a1bbdd05050d6a0cc8f043dbd947590d950c3b65147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a58a593e9572e915a3656e9c77b44afc9a648ff044b9801c746c24cf6a96c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397de003e2c837849f7a61c6b0a24ef501bb85c272c128711c403f0d116db41e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6a18e5ece88421ce4253988be2cf68eb165e78f9e8f5625ce9e3edf2c7876a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94c5ec02d6fe43dc52caf38da772907e7107d3dd734f8ba878ef859741b8b9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c1856081db695669070e56aa5b8ef4cbb2cdde663f4773d97b5810a6cf86fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24eb47763b4d3dc81412ad62b9b73e13a2f709a4077e5883f0b8420af737d1e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.138116 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3fce119-955f-405b-bfb3-96aa4b34aef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35db1fd2885288d3747c03640749381d6b53573bc7b7ae9ecb8b740f08d6adf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3
dac39d48dcf30cab3d54e1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd4ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pkmxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.156744 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a54fa2-898d-46ef-bb62-be104bf8c2fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c16e922e8c36194d86fc588e1d21b7b638221f734d9df0a5e219afb542066168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31df41c63b9035143bb3f1e1d6804a2312bf0
2103ccf10d8cab8cf8cb6b2c99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzh8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8r68t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.176294 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.176375 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.176397 4747 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.176427 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.176452 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.178484 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8b2f960-f275-45a1-a079-be89fdd3d03f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d54aa43655f5d5dd79c734d43e2f6958bd96c579bf7e6495648f70247742e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f576058685fcedbc1ea1a8a7106db5512b70f1ce58abb90dc6a1588b12f07985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b9db42345ee26e2e2ae615eef31cd93ef1653c5b26ebdfce105ddcb2ad8300\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ed4d9491c23d0c171012cc1a99ebf29339ff54fd6477438df4ef51e8327b085\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.201278 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"50e73530-98f4-40df-bf19-84eaa5f5ca1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://875dfba2f937da26cf30b27339af9e8bf09d2f7d55fcf1a48461bbbe676da174\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aad5af9f5557fbfb7134324d7e89976fecc09860322b576750d07b0b5dbbd4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8777e7738c26d33f4c868a906a11cbc26d9783eadaabfcc4d809b8e157c664\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://753781c36b7a7030ea48b9bceb4cf74232b392992289f1f7ac912cf7ab162440\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9fec6beb6c5495a7ecc5021d3e5a5be7719700a0aef0b8f7c3258f49f1d43d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T18:46:21Z\\\"
,\\\"message\\\":\\\"t denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 18:46:21.091535 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0930 18:46:21.091562 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 18:46:21.091581 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091593 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 18:46:21.091604 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 18:46:21.091612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 18:46:21.091620 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 18:46:21.091626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 18:46:21.096422 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3948409360/tls.crt::/tmp/serving-cert-3948409360/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759257964\\\\\\\\\\\\\\\" (2025-09-30 18:46:03 +0000 UTC to 2025-10-30 18:46:04 +0000 UTC (now=2025-09-30 18:46:21.096391108 +0000 UTC))\\\\\\\"\\\\nI0930 18:46:21.096527 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759257975\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759257975\\\\\\\\\\\\\\\" (2025-09-30 17:46:14 +0000 UTC to 2026-09-30 17:46:14 +0000 UTC (now=2025-09-30 18:46:21.096509682 +0000 UTC))\\\\\\\"\\\\nI0930 
18:46:21.096541 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0930 18:46:21.096559 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nF0930 18:46:21.096624 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d13b75a081ed68e32dfae8b389dd063c32ba36441adfdf37cd293ab7617da0a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21f1ea98febd72ee84d7a136c085c7e9ea41e87bc4cc21431419a09d6d4b6fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.216304 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.231830 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://366da0913203929f26232866b54a336374470651d9666d4f484bd816da828ac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.245181 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6djlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fbzb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc 
kubenswrapper[4747]: I0930 18:47:31.264466 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96fa3d6-a4fb-495d-a9f6-18040e0f1951\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a176e53d954d796dd96120997067464ad7f415a51d0ad294b1f2dbfddfc69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://045488824ae69725f6d4f1e7b1a64ac477992c6b47fba4401ff80b1ef80e4637\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f07768a70b97fe17019f202416d7b3dfc58b1a54996c3ded7a31e7a768d67f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abff02bcec9973216181ad58c22fbe63cf8719a51496ed09917b1f2e4037f098\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.279636 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.279792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.279826 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.279915 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.280018 4747 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.282850 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee9d3648299ebc2d59066ca604f3bead7199bc302bb68a525b00e8d9b8bbd63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d2fe4e14babc1fa2b75c42f1dc29b6fa062de80b0c874edbc6afc98caedd19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.299523 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.316686 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a624d6399ecca48db0062a24643fa8e0bd3b08309e6213a3ddbad6a17f5ca33e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.345083 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ec942cb-ba9d-49cd-b746-b78c0b135bed\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f7be77b3b2ba32c8736e5cd1beb586cd6c288e40ab0a7dac7eb78ba69f0707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5df091c807fda0ff96bd40fe0a7c26908d602edf7700efceca14edf805addd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6978f4e02f189d7e7598fd50672a7974e54edeed0decededece35cbfdbd66c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58280a0abe213c54679d6cdd8548ed72ef60f1db73cb032cc982eb9457ce57cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://998bab48c676b2a27ac035ce28030ecb692cafb6c38eaebf98966790db2b84b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd4bc88415695e827b0aa894a2e68672e9272be354622198fc75d5e78b7cf8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4752314dbd310a13fcc3a17d4822760a501c6f419789aee7f91efb1109147ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhw9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rcwt4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.365036 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:16Z\\\",\\\"message\\\":\\\"formers/factory.go:160\\\\nI0930 18:47:16.172150 6760 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.172392 6760 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.172555 6760 reflector.go:311] Stopping 
reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 18:47:16.173014 6760 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 18:47:16.173050 6760 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 18:47:16.173081 6760 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 18:47:16.173167 6760 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 18:47:16.173185 6760 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 18:47:16.173108 6760 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 18:47:16.173210 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 18:47:16.173217 6760 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 18:47:16.173240 6760 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 18:47:16.173288 6760 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 18:47:16.173313 6760 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:47:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c6676755da34c2bf1
800b37e631d3235d1329702d058a490083bcb49d376fd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bsls\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pnqjs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.376599 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"973afb09-20b0-46c6-bea9-e822e07c64f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c682b33e155a43e48d8173b084a93df1a6badd45c3c1fc9dbeb8daa9959952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2fe1df8dea520b2f21e42094118f650930f90b12a1912ea514d4f6f9d32df4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T18:46:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.382974 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.383009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.383021 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.383038 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.383048 4747 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.390223 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.398084 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2fkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20d6dd78-38e3-4c23-9478-ba7779842d5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2fe073fafdee62333249393e68366e27de137ccddc29a2f5ca3de961db8f141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj88c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2fkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.407754 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4zjq4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34f8698b-7682-4b27-99d0-d72fff30d5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T18:47:09Z\\\",\\\"message\\\":\\\"2025-09-30T18:46:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611\\\\n2025-09-30T18:46:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_527d91fe-3dee-445a-bb8e-560a68dd1611 to /host/opt/cni/bin/\\\\n2025-09-30T18:46:24Z [verbose] multus-daemon started\\\\n2025-09-30T18:46:24Z [verbose] Readiness Indicator file check\\\\n2025-09-30T18:47:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T18:46:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:47:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkqp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4zjq4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.418107 4747 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sdgzs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fbf7d9-81f5-4311-8619-3f0acd2c7fab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T18:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b91d3b6c2e6d5895b8f6014259c864db9361f791afb732abefdddf3b443650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T18:46:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk6bl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T18:46:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sdgzs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T18:47:31Z is after 2025-08-24T17:21:41Z" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.486147 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.486208 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.486228 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.486255 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.486273 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.589134 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.589203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.589227 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.589261 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.589282 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.691821 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.691896 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.691919 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.691978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.692002 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.795844 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.795913 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.795959 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.795985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.796004 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.898996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.899066 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.899085 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.899115 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:31 crc kubenswrapper[4747]: I0930 18:47:31.899135 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:31Z","lastTransitionTime":"2025-09-30T18:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.003009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.003064 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.003082 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.003106 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.003122 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.086746 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.087155 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 18:47:32 crc kubenswrapper[4747]: E0930 18:47:32.087469 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 18:47:32 crc kubenswrapper[4747]: E0930 18:47:32.087628 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.087917 4747 scope.go:117] "RemoveContainer" containerID="4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5"
Sep 30 18:47:32 crc kubenswrapper[4747]: E0930 18:47:32.088221 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.106394 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.106597 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.106791 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.106991 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.107179 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.210319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.210736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.210893 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.211098 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.211232 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.314323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.314381 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.314440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.314464 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.314503 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.418291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.418355 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.418440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.418468 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.418486 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.521674 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.521740 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.521757 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.521780 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.521798 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.625395 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.625451 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.625470 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.625494 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.625511 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.729184 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.729280 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.729323 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.729360 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.729385 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.832268 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.832331 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.832353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.832378 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.832395 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.935438 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.935497 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.935515 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.935540 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:32 crc kubenswrapper[4747]: I0930 18:47:32.935561 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:32Z","lastTransitionTime":"2025-09-30T18:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.038985 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.039067 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.039094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.039125 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.039148 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.087194 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.087293 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 18:47:33 crc kubenswrapper[4747]: E0930 18:47:33.087379 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8"
Sep 30 18:47:33 crc kubenswrapper[4747]: E0930 18:47:33.087496 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.142402 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.142474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.142493 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.142519 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.142539 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.249218 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.249271 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.249326 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.249354 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.249373 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.352304 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.352371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.352389 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.352415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.352432 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.455328 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.455396 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.455415 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.455440 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.455457 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.558319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.558384 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.558404 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.558429 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.558447 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.661730 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.661889 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.661910 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.662002 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.662025 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.763761 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.763808 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.763819 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.763837 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.763850 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.866271 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.866309 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.866319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.866332 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.866342 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.969170 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.969239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.969257 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.969284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:33 crc kubenswrapper[4747]: I0930 18:47:33.969304 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:33Z","lastTransitionTime":"2025-09-30T18:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.072233 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.072298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.072317 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.072344 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.072363 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.086501 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.086596 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 18:47:34 crc kubenswrapper[4747]: E0930 18:47:34.086683 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 18:47:34 crc kubenswrapper[4747]: E0930 18:47:34.086777 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.175996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.176053 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.176071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.176094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.176115 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.279350 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.279413 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.279432 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.279457 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.279476 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.382456 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.382536 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.382566 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.382612 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.382638 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.490036 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.490122 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.490161 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.490197 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.490222 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.593116 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.593192 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.593216 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.593247 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.593270 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.696951 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.697007 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.697023 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.697048 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.697066 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.800425 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.800499 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.800517 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.800545 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.800564 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.903846 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.903895 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.903907 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.903942 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 18:47:34 crc kubenswrapper[4747]: I0930 18:47:34.903958 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:34Z","lastTransitionTime":"2025-09-30T18:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.006961 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.007015 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.007035 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.007059 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.007078 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.086921 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.087003 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:35 crc kubenswrapper[4747]: E0930 18:47:35.087163 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:35 crc kubenswrapper[4747]: E0930 18:47:35.087323 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.109962 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.110033 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.110058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.110089 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.110113 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.213154 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.213207 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.213226 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.213259 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.213277 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.316240 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.316298 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.316316 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.316341 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.316359 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.419477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.419555 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.419578 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.419608 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.419708 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.522905 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.522965 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.522977 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.522994 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.523005 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.626351 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.626401 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.626419 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.626441 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.626457 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.729142 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.729225 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.729246 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.729278 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.729296 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.831535 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.831580 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.831594 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.831615 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.831629 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.935203 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.935299 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.935322 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.935353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:35 crc kubenswrapper[4747]: I0930 18:47:35.935374 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:35Z","lastTransitionTime":"2025-09-30T18:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.038982 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.039048 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.039069 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.039094 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.039113 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.086805 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.086805 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:36 crc kubenswrapper[4747]: E0930 18:47:36.086984 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:36 crc kubenswrapper[4747]: E0930 18:47:36.087048 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.142374 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.142429 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.142441 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.142462 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.142477 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.245418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.245474 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.245491 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.245512 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.245529 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.348109 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.348157 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.348201 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.348221 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.348256 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.451878 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.451972 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.451989 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.452009 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.452021 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.555984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.556040 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.556058 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.556083 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.556106 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.658996 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.659061 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.659084 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.659111 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.659132 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.761335 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.761417 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.761442 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.761472 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.761492 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.864291 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.864371 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.864475 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.864503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.864523 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.969779 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.969854 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.969871 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.969899 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:36 crc kubenswrapper[4747]: I0930 18:47:36.969918 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:36Z","lastTransitionTime":"2025-09-30T18:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.072653 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.072748 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.072771 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.072801 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.072820 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.087083 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.087169 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:37 crc kubenswrapper[4747]: E0930 18:47:37.087305 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:37 crc kubenswrapper[4747]: E0930 18:47:37.087493 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.176175 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.176239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.176257 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.176281 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.176298 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.280194 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.280262 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.280284 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.280314 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.280335 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.383482 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.383542 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.383559 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.383582 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.383600 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.486623 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.486678 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.486699 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.486728 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.486751 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.589630 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.589697 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.589717 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.589745 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.589764 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.693757 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.693851 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.693876 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.693959 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.693980 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.796892 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.796978 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.796989 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.797010 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.797026 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.899702 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.899769 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.899783 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.899813 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:37 crc kubenswrapper[4747]: I0930 18:47:37.899829 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:37Z","lastTransitionTime":"2025-09-30T18:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.003319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.003380 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.003399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.003426 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.003445 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.086860 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.087065 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:38 crc kubenswrapper[4747]: E0930 18:47:38.087242 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:38 crc kubenswrapper[4747]: E0930 18:47:38.087555 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.106376 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.106477 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.106501 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.106528 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.106552 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.209229 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.209292 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.209319 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.209348 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.209371 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.312564 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.312624 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.312647 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.312672 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.312690 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.415383 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.415460 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.415482 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.415513 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.415537 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.519105 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.519162 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.519212 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.519239 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.519265 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.622722 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.622792 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.622811 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.622836 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.622859 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.725875 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.725961 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.725984 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.726093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.726133 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.828884 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.829099 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.829139 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.829182 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.829216 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.931399 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.931503 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.931565 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.931593 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:38 crc kubenswrapper[4747]: I0930 18:47:38.931614 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:38Z","lastTransitionTime":"2025-09-30T18:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.034953 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.035018 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.035038 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.035062 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.035080 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:39Z","lastTransitionTime":"2025-09-30T18:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.086904 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:39 crc kubenswrapper[4747]: E0930 18:47:39.087092 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.086919 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:39 crc kubenswrapper[4747]: E0930 18:47:39.087367 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.137711 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.137778 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.137805 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.137834 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.137860 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:39Z","lastTransitionTime":"2025-09-30T18:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.241571 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.241650 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.241668 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.241693 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.241710 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:39Z","lastTransitionTime":"2025-09-30T18:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.344353 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.344418 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.344435 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.344461 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.344479 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:39Z","lastTransitionTime":"2025-09-30T18:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.448005 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.448071 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.448093 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.448120 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.448141 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:39Z","lastTransitionTime":"2025-09-30T18:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.550485 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.550553 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.550572 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.550595 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.550615 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:39Z","lastTransitionTime":"2025-09-30T18:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.592838 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.593168 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.593234 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.593271 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.593292 4747 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T18:47:39Z","lastTransitionTime":"2025-09-30T18:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.657518 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm"] Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.658074 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.661237 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.661460 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.661357 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.665862 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.704120 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.704052579 podStartE2EDuration="44.704052579s" podCreationTimestamp="2025-09-30 18:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:39.703478208 +0000 UTC m=+99.362958392" watchObservedRunningTime="2025-09-30 18:47:39.704052579 +0000 UTC m=+99.363532733" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.753074 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.753017802 podStartE2EDuration="1m18.753017802s" podCreationTimestamp="2025-09-30 18:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:39.729765592 +0000 UTC m=+99.389245736" watchObservedRunningTime="2025-09-30 18:47:39.753017802 +0000 UTC 
m=+99.412497956" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.829849 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f52c4e4a-4621-42e3-be84-3699339a5a38-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.830095 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f52c4e4a-4621-42e3-be84-3699339a5a38-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.830161 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f52c4e4a-4621-42e3-be84-3699339a5a38-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.830249 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f52c4e4a-4621-42e3-be84-3699339a5a38-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.830319 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/f52c4e4a-4621-42e3-be84-3699339a5a38-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.870556 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rcwt4" podStartSLOduration=77.870539102 podStartE2EDuration="1m17.870539102s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:39.816169991 +0000 UTC m=+99.475650195" watchObservedRunningTime="2025-09-30 18:47:39.870539102 +0000 UTC m=+99.530019226" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.886333 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.886317886 podStartE2EDuration="1m18.886317886s" podCreationTimestamp="2025-09-30 18:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:39.886081142 +0000 UTC m=+99.545561246" watchObservedRunningTime="2025-09-30 18:47:39.886317886 +0000 UTC m=+99.545798000" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.929905 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sdgzs" podStartSLOduration=77.929883722 podStartE2EDuration="1m17.929883722s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:39.929276931 +0000 UTC m=+99.588757055" watchObservedRunningTime="2025-09-30 18:47:39.929883722 
+0000 UTC m=+99.589363836" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.931481 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f52c4e4a-4621-42e3-be84-3699339a5a38-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.931530 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f52c4e4a-4621-42e3-be84-3699339a5a38-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.931549 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f52c4e4a-4621-42e3-be84-3699339a5a38-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.931595 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f52c4e4a-4621-42e3-be84-3699339a5a38-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.931619 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f52c4e4a-4621-42e3-be84-3699339a5a38-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.931702 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f52c4e4a-4621-42e3-be84-3699339a5a38-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.931985 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f52c4e4a-4621-42e3-be84-3699339a5a38-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.932453 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f52c4e4a-4621-42e3-be84-3699339a5a38-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.938417 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f52c4e4a-4621-42e3-be84-3699339a5a38-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 
18:47:39.949590 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.949572737 podStartE2EDuration="25.949572737s" podCreationTimestamp="2025-09-30 18:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:39.949023627 +0000 UTC m=+99.608503761" watchObservedRunningTime="2025-09-30 18:47:39.949572737 +0000 UTC m=+99.609052851" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.953445 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f52c4e4a-4621-42e3-be84-3699339a5a38-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rwnfm\" (UID: \"f52c4e4a-4621-42e3-be84-3699339a5a38\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.978692 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v2fkl" podStartSLOduration=78.978675632 podStartE2EDuration="1m18.978675632s" podCreationTimestamp="2025-09-30 18:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:39.978302906 +0000 UTC m=+99.637783040" watchObservedRunningTime="2025-09-30 18:47:39.978675632 +0000 UTC m=+99.638155736" Sep 30 18:47:39 crc kubenswrapper[4747]: I0930 18:47:39.981023 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.006331 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4zjq4" podStartSLOduration=78.006309011 podStartE2EDuration="1m18.006309011s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:39.998308396 +0000 UTC m=+99.657788520" watchObservedRunningTime="2025-09-30 18:47:40.006309011 +0000 UTC m=+99.665789135" Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.031988 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.031966194 podStartE2EDuration="1m18.031966194s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:40.031579327 +0000 UTC m=+99.691059441" watchObservedRunningTime="2025-09-30 18:47:40.031966194 +0000 UTC m=+99.691446318" Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.044500 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podStartSLOduration=78.044477139 podStartE2EDuration="1m18.044477139s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:40.044386947 +0000 UTC m=+99.703867071" watchObservedRunningTime="2025-09-30 18:47:40.044477139 +0000 UTC m=+99.703957263" Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.061171 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8r68t" podStartSLOduration=78.06115179 podStartE2EDuration="1m18.06115179s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:40.060151602 +0000 UTC m=+99.719631746" watchObservedRunningTime="2025-09-30 18:47:40.06115179 +0000 UTC m=+99.720631914" Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.086441 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.086447 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:40 crc kubenswrapper[4747]: E0930 18:47:40.086560 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:40 crc kubenswrapper[4747]: E0930 18:47:40.086636 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.727545 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" event={"ID":"f52c4e4a-4621-42e3-be84-3699339a5a38","Type":"ContainerStarted","Data":"a714a1e866166d56ee2d8fc7ebe1af0b8aea35e6f4dc9ca563d1f8af03629f1d"} Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.728003 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" event={"ID":"f52c4e4a-4621-42e3-be84-3699339a5a38","Type":"ContainerStarted","Data":"7f9b75f487298fb640350d011b28cdaecf42ae0b1b3bdc76741bef692bdb6888"} Sep 30 18:47:40 crc kubenswrapper[4747]: I0930 18:47:40.752327 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rwnfm" podStartSLOduration=78.752293106 podStartE2EDuration="1m18.752293106s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:40.749767101 +0000 UTC m=+100.409247275" watchObservedRunningTime="2025-09-30 18:47:40.752293106 +0000 UTC m=+100.411773270" Sep 30 18:47:41 crc kubenswrapper[4747]: I0930 18:47:41.086385 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:41 crc kubenswrapper[4747]: I0930 18:47:41.086425 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:41 crc kubenswrapper[4747]: E0930 18:47:41.088210 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:41 crc kubenswrapper[4747]: E0930 18:47:41.088392 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:41 crc kubenswrapper[4747]: I0930 18:47:41.350065 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:41 crc kubenswrapper[4747]: E0930 18:47:41.350296 4747 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:47:41 crc kubenswrapper[4747]: E0930 18:47:41.350438 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs podName:5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8 nodeName:}" failed. No retries permitted until 2025-09-30 18:48:45.350405555 +0000 UTC m=+165.009885709 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs") pod "network-metrics-daemon-fbzb6" (UID: "5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 18:47:42 crc kubenswrapper[4747]: I0930 18:47:42.087217 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:42 crc kubenswrapper[4747]: I0930 18:47:42.087217 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:42 crc kubenswrapper[4747]: E0930 18:47:42.087680 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:42 crc kubenswrapper[4747]: E0930 18:47:42.087849 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:43 crc kubenswrapper[4747]: I0930 18:47:43.086605 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:43 crc kubenswrapper[4747]: I0930 18:47:43.086622 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:43 crc kubenswrapper[4747]: E0930 18:47:43.086853 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:43 crc kubenswrapper[4747]: E0930 18:47:43.087032 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:44 crc kubenswrapper[4747]: I0930 18:47:44.086365 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:44 crc kubenswrapper[4747]: E0930 18:47:44.086499 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:44 crc kubenswrapper[4747]: I0930 18:47:44.086678 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:44 crc kubenswrapper[4747]: E0930 18:47:44.086731 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:45 crc kubenswrapper[4747]: I0930 18:47:45.087094 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:45 crc kubenswrapper[4747]: E0930 18:47:45.087341 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:45 crc kubenswrapper[4747]: I0930 18:47:45.087424 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:45 crc kubenswrapper[4747]: E0930 18:47:45.087546 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:46 crc kubenswrapper[4747]: I0930 18:47:46.086226 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:46 crc kubenswrapper[4747]: I0930 18:47:46.086473 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:46 crc kubenswrapper[4747]: E0930 18:47:46.087008 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:46 crc kubenswrapper[4747]: I0930 18:47:46.087206 4747 scope.go:117] "RemoveContainer" containerID="4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5" Sep 30 18:47:46 crc kubenswrapper[4747]: E0930 18:47:46.087289 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:46 crc kubenswrapper[4747]: E0930 18:47:46.087704 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pnqjs_openshift-ovn-kubernetes(5851f3a5-36f6-4e85-8584-5ce70fda9d7d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" Sep 30 18:47:47 crc kubenswrapper[4747]: I0930 18:47:47.087226 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:47 crc kubenswrapper[4747]: I0930 18:47:47.087259 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:47 crc kubenswrapper[4747]: E0930 18:47:47.087440 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:47 crc kubenswrapper[4747]: E0930 18:47:47.087583 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:48 crc kubenswrapper[4747]: I0930 18:47:48.086808 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:48 crc kubenswrapper[4747]: I0930 18:47:48.087361 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:48 crc kubenswrapper[4747]: E0930 18:47:48.087551 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:48 crc kubenswrapper[4747]: E0930 18:47:48.087806 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:49 crc kubenswrapper[4747]: I0930 18:47:49.087063 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:49 crc kubenswrapper[4747]: E0930 18:47:49.087211 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:49 crc kubenswrapper[4747]: I0930 18:47:49.087350 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:49 crc kubenswrapper[4747]: E0930 18:47:49.087581 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:50 crc kubenswrapper[4747]: I0930 18:47:50.087061 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:50 crc kubenswrapper[4747]: I0930 18:47:50.087137 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:50 crc kubenswrapper[4747]: E0930 18:47:50.087255 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:50 crc kubenswrapper[4747]: E0930 18:47:50.087603 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:51 crc kubenswrapper[4747]: I0930 18:47:51.087129 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:51 crc kubenswrapper[4747]: I0930 18:47:51.087162 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:51 crc kubenswrapper[4747]: E0930 18:47:51.087868 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:51 crc kubenswrapper[4747]: E0930 18:47:51.087983 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:52 crc kubenswrapper[4747]: I0930 18:47:52.086462 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:52 crc kubenswrapper[4747]: I0930 18:47:52.086486 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:52 crc kubenswrapper[4747]: E0930 18:47:52.086635 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:52 crc kubenswrapper[4747]: E0930 18:47:52.086820 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:53 crc kubenswrapper[4747]: I0930 18:47:53.087170 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:53 crc kubenswrapper[4747]: I0930 18:47:53.087306 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:53 crc kubenswrapper[4747]: E0930 18:47:53.087394 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:53 crc kubenswrapper[4747]: E0930 18:47:53.087557 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:54 crc kubenswrapper[4747]: I0930 18:47:54.086532 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:54 crc kubenswrapper[4747]: E0930 18:47:54.087142 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:54 crc kubenswrapper[4747]: I0930 18:47:54.086579 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:54 crc kubenswrapper[4747]: E0930 18:47:54.087413 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:55 crc kubenswrapper[4747]: I0930 18:47:55.086922 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:55 crc kubenswrapper[4747]: I0930 18:47:55.087118 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:55 crc kubenswrapper[4747]: E0930 18:47:55.087276 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:55 crc kubenswrapper[4747]: E0930 18:47:55.087435 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:56 crc kubenswrapper[4747]: I0930 18:47:56.086375 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:56 crc kubenswrapper[4747]: I0930 18:47:56.086416 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:56 crc kubenswrapper[4747]: E0930 18:47:56.086582 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:56 crc kubenswrapper[4747]: E0930 18:47:56.086691 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:56 crc kubenswrapper[4747]: I0930 18:47:56.788182 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/1.log" Sep 30 18:47:56 crc kubenswrapper[4747]: I0930 18:47:56.788825 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/0.log" Sep 30 18:47:56 crc kubenswrapper[4747]: I0930 18:47:56.788863 4747 generic.go:334] "Generic (PLEG): container finished" podID="34f8698b-7682-4b27-99d0-d72fff30d5a8" containerID="f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33" exitCode=1 Sep 30 18:47:56 crc kubenswrapper[4747]: I0930 18:47:56.788898 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zjq4" event={"ID":"34f8698b-7682-4b27-99d0-d72fff30d5a8","Type":"ContainerDied","Data":"f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33"} Sep 30 18:47:56 crc kubenswrapper[4747]: I0930 18:47:56.788960 4747 scope.go:117] "RemoveContainer" containerID="0388b5a3cb3d4255badd095b4c3ee37840d484c2556d651dcb72b9653a8d372c" Sep 30 18:47:56 crc kubenswrapper[4747]: I0930 18:47:56.789450 4747 scope.go:117] "RemoveContainer" containerID="f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33" Sep 30 18:47:56 crc kubenswrapper[4747]: E0930 18:47:56.789651 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-4zjq4_openshift-multus(34f8698b-7682-4b27-99d0-d72fff30d5a8)\"" pod="openshift-multus/multus-4zjq4" podUID="34f8698b-7682-4b27-99d0-d72fff30d5a8" Sep 30 18:47:57 crc kubenswrapper[4747]: I0930 18:47:57.087346 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:57 crc kubenswrapper[4747]: I0930 18:47:57.087476 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:57 crc kubenswrapper[4747]: E0930 18:47:57.087599 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:57 crc kubenswrapper[4747]: E0930 18:47:57.087868 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:57 crc kubenswrapper[4747]: I0930 18:47:57.794520 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/1.log" Sep 30 18:47:58 crc kubenswrapper[4747]: I0930 18:47:58.087175 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:47:58 crc kubenswrapper[4747]: E0930 18:47:58.087364 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:47:58 crc kubenswrapper[4747]: I0930 18:47:58.087648 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:47:58 crc kubenswrapper[4747]: E0930 18:47:58.088221 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:47:58 crc kubenswrapper[4747]: I0930 18:47:58.088714 4747 scope.go:117] "RemoveContainer" containerID="4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5" Sep 30 18:47:58 crc kubenswrapper[4747]: I0930 18:47:58.800165 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/3.log" Sep 30 18:47:58 crc kubenswrapper[4747]: I0930 18:47:58.804030 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerStarted","Data":"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1"} Sep 30 18:47:58 crc kubenswrapper[4747]: I0930 18:47:58.804585 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:47:58 crc kubenswrapper[4747]: I0930 18:47:58.834446 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podStartSLOduration=96.834414433 
podStartE2EDuration="1m36.834414433s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:47:58.831655183 +0000 UTC m=+118.491135337" watchObservedRunningTime="2025-09-30 18:47:58.834414433 +0000 UTC m=+118.493894577" Sep 30 18:47:59 crc kubenswrapper[4747]: I0930 18:47:59.088218 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:59 crc kubenswrapper[4747]: I0930 18:47:59.088393 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:47:59 crc kubenswrapper[4747]: E0930 18:47:59.088600 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:47:59 crc kubenswrapper[4747]: E0930 18:47:59.088802 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:47:59 crc kubenswrapper[4747]: I0930 18:47:59.199960 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fbzb6"] Sep 30 18:47:59 crc kubenswrapper[4747]: I0930 18:47:59.807060 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:47:59 crc kubenswrapper[4747]: E0930 18:47:59.807874 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:00 crc kubenswrapper[4747]: I0930 18:48:00.086307 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:00 crc kubenswrapper[4747]: I0930 18:48:00.086335 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:00 crc kubenswrapper[4747]: E0930 18:48:00.086520 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:00 crc kubenswrapper[4747]: E0930 18:48:00.086611 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:01 crc kubenswrapper[4747]: E0930 18:48:01.015390 4747 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 18:48:01 crc kubenswrapper[4747]: I0930 18:48:01.086714 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:01 crc kubenswrapper[4747]: E0930 18:48:01.090066 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:48:01 crc kubenswrapper[4747]: E0930 18:48:01.185386 4747 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 18:48:02 crc kubenswrapper[4747]: I0930 18:48:02.086910 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:02 crc kubenswrapper[4747]: I0930 18:48:02.086963 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:02 crc kubenswrapper[4747]: I0930 18:48:02.087060 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:02 crc kubenswrapper[4747]: E0930 18:48:02.087225 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:02 crc kubenswrapper[4747]: E0930 18:48:02.087379 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:02 crc kubenswrapper[4747]: E0930 18:48:02.087505 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:03 crc kubenswrapper[4747]: I0930 18:48:03.086726 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:03 crc kubenswrapper[4747]: E0930 18:48:03.086948 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:48:04 crc kubenswrapper[4747]: I0930 18:48:04.087047 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:04 crc kubenswrapper[4747]: I0930 18:48:04.087054 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:04 crc kubenswrapper[4747]: E0930 18:48:04.087257 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:04 crc kubenswrapper[4747]: I0930 18:48:04.087054 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:04 crc kubenswrapper[4747]: E0930 18:48:04.087410 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:04 crc kubenswrapper[4747]: E0930 18:48:04.087560 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:05 crc kubenswrapper[4747]: I0930 18:48:05.086845 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:05 crc kubenswrapper[4747]: E0930 18:48:05.087358 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:48:06 crc kubenswrapper[4747]: I0930 18:48:06.086372 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:06 crc kubenswrapper[4747]: I0930 18:48:06.086448 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:06 crc kubenswrapper[4747]: I0930 18:48:06.086379 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:06 crc kubenswrapper[4747]: E0930 18:48:06.086557 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:06 crc kubenswrapper[4747]: E0930 18:48:06.086738 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:06 crc kubenswrapper[4747]: E0930 18:48:06.086994 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:06 crc kubenswrapper[4747]: E0930 18:48:06.187060 4747 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 18:48:07 crc kubenswrapper[4747]: I0930 18:48:07.087098 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:07 crc kubenswrapper[4747]: E0930 18:48:07.087291 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:48:08 crc kubenswrapper[4747]: I0930 18:48:08.086269 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:08 crc kubenswrapper[4747]: I0930 18:48:08.086372 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:08 crc kubenswrapper[4747]: E0930 18:48:08.086455 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:08 crc kubenswrapper[4747]: E0930 18:48:08.086560 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:08 crc kubenswrapper[4747]: I0930 18:48:08.086636 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:08 crc kubenswrapper[4747]: E0930 18:48:08.086743 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:09 crc kubenswrapper[4747]: I0930 18:48:09.086494 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:09 crc kubenswrapper[4747]: E0930 18:48:09.086696 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:48:10 crc kubenswrapper[4747]: I0930 18:48:10.087175 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:10 crc kubenswrapper[4747]: I0930 18:48:10.087175 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:10 crc kubenswrapper[4747]: E0930 18:48:10.087422 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:10 crc kubenswrapper[4747]: I0930 18:48:10.087208 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:10 crc kubenswrapper[4747]: E0930 18:48:10.087555 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:10 crc kubenswrapper[4747]: E0930 18:48:10.087795 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:11 crc kubenswrapper[4747]: I0930 18:48:11.086824 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:11 crc kubenswrapper[4747]: E0930 18:48:11.086982 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:48:11 crc kubenswrapper[4747]: E0930 18:48:11.187619 4747 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 18:48:12 crc kubenswrapper[4747]: I0930 18:48:12.086232 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:12 crc kubenswrapper[4747]: I0930 18:48:12.086475 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:12 crc kubenswrapper[4747]: I0930 18:48:12.086258 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:12 crc kubenswrapper[4747]: E0930 18:48:12.086600 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:12 crc kubenswrapper[4747]: E0930 18:48:12.086744 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:12 crc kubenswrapper[4747]: E0930 18:48:12.086956 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:12 crc kubenswrapper[4747]: I0930 18:48:12.087564 4747 scope.go:117] "RemoveContainer" containerID="f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33" Sep 30 18:48:12 crc kubenswrapper[4747]: I0930 18:48:12.860265 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/1.log" Sep 30 18:48:12 crc kubenswrapper[4747]: I0930 18:48:12.860383 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zjq4" event={"ID":"34f8698b-7682-4b27-99d0-d72fff30d5a8","Type":"ContainerStarted","Data":"aa3692f348ec4e24682c7900affd59c3ddc6c6a3de5a1e5a2f45a754c971356d"} Sep 30 18:48:13 crc kubenswrapper[4747]: I0930 18:48:13.087118 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:13 crc kubenswrapper[4747]: E0930 18:48:13.087329 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:48:14 crc kubenswrapper[4747]: I0930 18:48:14.086173 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:14 crc kubenswrapper[4747]: I0930 18:48:14.086200 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:14 crc kubenswrapper[4747]: I0930 18:48:14.086285 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:14 crc kubenswrapper[4747]: E0930 18:48:14.086357 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:14 crc kubenswrapper[4747]: E0930 18:48:14.086518 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:14 crc kubenswrapper[4747]: E0930 18:48:14.086726 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:15 crc kubenswrapper[4747]: I0930 18:48:15.086422 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:15 crc kubenswrapper[4747]: E0930 18:48:15.086614 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 18:48:16 crc kubenswrapper[4747]: I0930 18:48:16.087135 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:16 crc kubenswrapper[4747]: I0930 18:48:16.087209 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:16 crc kubenswrapper[4747]: I0930 18:48:16.087152 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:16 crc kubenswrapper[4747]: E0930 18:48:16.087320 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 18:48:16 crc kubenswrapper[4747]: E0930 18:48:16.087423 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbzb6" podUID="5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8" Sep 30 18:48:16 crc kubenswrapper[4747]: E0930 18:48:16.087595 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 18:48:17 crc kubenswrapper[4747]: I0930 18:48:17.086842 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:17 crc kubenswrapper[4747]: I0930 18:48:17.090337 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 18:48:17 crc kubenswrapper[4747]: I0930 18:48:17.090430 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 18:48:18 crc kubenswrapper[4747]: I0930 18:48:18.086492 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:18 crc kubenswrapper[4747]: I0930 18:48:18.086594 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:18 crc kubenswrapper[4747]: I0930 18:48:18.086504 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:18 crc kubenswrapper[4747]: I0930 18:48:18.088457 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 18:48:18 crc kubenswrapper[4747]: I0930 18:48:18.088529 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 18:48:18 crc kubenswrapper[4747]: I0930 18:48:18.088631 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 18:48:18 crc kubenswrapper[4747]: I0930 18:48:18.090046 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 18:48:19 crc kubenswrapper[4747]: I0930 18:48:19.161107 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.693736 4747 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.745021 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.746086 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7jqj9"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.746602 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.747173 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmpsg"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.747753 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.747802 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.748262 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.749673 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-sn6w4"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.750298 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-74cqx"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.750562 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.750584 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.753868 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.755440 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: W0930 18:48:20.755504 4747 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Sep 30 18:48:20 crc kubenswrapper[4747]: E0930 18:48:20.755563 4747 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace 
\"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 18:48:20 crc kubenswrapper[4747]: W0930 18:48:20.755733 4747 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Sep 30 18:48:20 crc kubenswrapper[4747]: W0930 18:48:20.755772 4747 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Sep 30 18:48:20 crc kubenswrapper[4747]: E0930 18:48:20.755792 4747 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 18:48:20 crc kubenswrapper[4747]: E0930 18:48:20.755807 4747 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 18:48:20 crc kubenswrapper[4747]: W0930 18:48:20.755877 4747 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is 
forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Sep 30 18:48:20 crc kubenswrapper[4747]: E0930 18:48:20.755901 4747 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.757381 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.757801 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.758212 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.758957 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.759387 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.765497 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.765644 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.766115 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.770370 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.770646 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.771438 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.775104 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.775107 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h7w64"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.775693 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.775895 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.776226 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.776308 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.777277 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.778349 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.778828 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.779131 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.779195 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.785280 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.785527 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.785793 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.786018 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.785793 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" Sep 30 18:48:20 crc kubenswrapper[4747]: W0930 18:48:20.786539 4747 reflector.go:561] object-"openshift-console"/"oauth-serving-cert": failed to list *v1.ConfigMap: configmaps "oauth-serving-cert" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Sep 30 18:48:20 crc kubenswrapper[4747]: E0930 18:48:20.786585 4747 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"oauth-serving-cert\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"oauth-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.787164 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.787347 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.787654 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.788432 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.794843 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.794907 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: W0930 18:48:20.795090 4747 reflector.go:561] object-"openshift-console"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.795205 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 18:48:20 crc kubenswrapper[4747]: E0930 18:48:20.795228 4747 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship 
found between node 'crc' and this object" logger="UnhandledError" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.795310 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.795484 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.795705 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.795731 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.795780 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.795920 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.795976 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.794841 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.796137 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.796257 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.796353 4747 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-console-operator"/"serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.796273 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.798458 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.800663 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dg925"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.801531 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.802075 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.833367 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.833981 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.835657 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.835818 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-config\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.835905 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-client-ca\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.835991 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkhb9\" (UniqueName: \"kubernetes.io/projected/00a32b13-e38c-424a-8db2-92ea1032208b-kube-api-access-lkhb9\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.836049 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a32b13-e38c-424a-8db2-92ea1032208b-serving-cert\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: 
\"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.836281 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.837820 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-56pzb"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.838801 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-56pzb" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.845313 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.845527 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.845605 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.845663 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.845688 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.845840 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.846045 4747 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.846227 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.846324 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.846956 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.847183 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.847336 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.848096 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.848204 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.848377 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.848145 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kzs26"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.849055 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h88d"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.849504 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.849860 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.850034 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.850359 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.851390 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.851612 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.852292 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6tvj9"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.855595 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.855904 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jz9dr"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.855908 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.856269 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.857094 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.857116 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rc5z5"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.857290 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.857463 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.857878 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.857919 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858092 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858264 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858374 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858470 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858568 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858581 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858645 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858801 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858870 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.858958 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.859026 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.859201 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.859469 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.859586 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 18:48:20 crc 
kubenswrapper[4747]: I0930 18:48:20.859639 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.859787 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.859888 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.860183 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.860228 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.861395 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.861533 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.861816 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.861848 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.861942 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862117 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862199 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862219 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862291 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862325 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862401 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862428 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862461 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862495 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862529 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862693 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.862833 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.863136 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.869041 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.869868 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.871189 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.871883 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.872355 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.874414 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.884298 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.888810 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.888829 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.895033 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.895533 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.896250 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.897416 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.897683 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.898036 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.898301 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.898897 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.899114 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fml97"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.899168 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.900002 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.902748 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.907863 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.909778 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.910798 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.911296 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.912257 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kfbg"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.912272 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.913597 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.917757 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmpsg"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.917792 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sn6w4"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.917803 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-74cqx"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.922061 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.922132 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.922912 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.923765 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.924031 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7jqj9"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.925849 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-49glq"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.926421 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-49glq" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.928582 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.930652 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6xh4b"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.931393 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.932234 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.934570 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.934629 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8jxgp"] Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.936601 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-client-ca\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.936699 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2plxl\" (UniqueName: \"kubernetes.io/projected/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-kube-api-access-2plxl\") pod \"console-operator-58897d9998-h7w64\" (UID: 
\"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.936776 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-service-ca-bundle\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.936852 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-serving-cert\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.936936 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3ae18e-3856-485b-9d77-f788989f86df-config\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.937012 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.937539 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-config\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.937589 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhb9\" (UniqueName: \"kubernetes.io/projected/00a32b13-e38c-424a-8db2-92ea1032208b-kube-api-access-lkhb9\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.937609 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-image-import-ca\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.937717 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/708b7382-ffc3-42e3-ac45-e1776b18473e-images\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.937740 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dg925\" 
(UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.937757 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57dtv\" (UniqueName: \"kubernetes.io/projected/d8931532-e4ee-4af7-b128-eddc57597a19-kube-api-access-57dtv\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.937988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938059 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dae6e892-a5c9-493d-825c-49f6181a0f41-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gv745\" (UID: \"dae6e892-a5c9-493d-825c-49f6181a0f41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938090 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chr44\" (UniqueName: \"kubernetes.io/projected/5f90c236-a235-4782-8351-cad3bb90e3fa-kube-api-access-chr44\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938217 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-service-ca\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938223 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-client-ca\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938256 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8q2\" (UniqueName: \"kubernetes.io/projected/ac3ae18e-3856-485b-9d77-f788989f86df-kube-api-access-px8q2\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938296 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4g5\" (UniqueName: \"kubernetes.io/projected/dae6e892-a5c9-493d-825c-49f6181a0f41-kube-api-access-4f4g5\") pod \"cluster-samples-operator-665b6dd947-gv745\" (UID: \"dae6e892-a5c9-493d-825c-49f6181a0f41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938321 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-serving-cert\") pod 
\"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938338 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-serving-cert\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938358 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4a1decac-0009-405b-9e01-8669eb06a74e-node-pullsecrets\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938373 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npljh\" (UniqueName: \"kubernetes.io/projected/4a1decac-0009-405b-9e01-8669eb06a74e-kube-api-access-npljh\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938391 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938433 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-encryption-config\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938608 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a1decac-0009-405b-9e01-8669eb06a74e-audit-dir\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938717 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-oauth-config\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938778 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-serving-cert\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938798 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-trusted-ca\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938940 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-config\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.938974 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/114885db-d01b-4a85-8b3a-74585b9c1f13-config\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.939055 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-trusted-ca-bundle\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.939076 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.939208 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/708b7382-ffc3-42e3-ac45-e1776b18473e-config\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.939337 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3ae18e-3856-485b-9d77-f788989f86df-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.939433 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-client-ca\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.939528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a32b13-e38c-424a-8db2-92ea1032208b-serving-cert\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.939619 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.939759 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j754z\" (UniqueName: \"kubernetes.io/projected/708b7382-ffc3-42e3-ac45-e1776b18473e-kube-api-access-j754z\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940068 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s4bk\" (UniqueName: \"kubernetes.io/projected/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-kube-api-access-4s4bk\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940159 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/708b7382-ffc3-42e3-ac45-e1776b18473e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940230 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxbpc\" (UniqueName: \"kubernetes.io/projected/114885db-d01b-4a85-8b3a-74585b9c1f13-kube-api-access-cxbpc\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940299 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-etcd-client\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940366 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-etcd-serving-ca\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940433 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-console-config\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940503 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-config\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940569 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-audit\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940633 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/114885db-d01b-4a85-8b3a-74585b9c1f13-auth-proxy-config\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940700 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/114885db-d01b-4a85-8b3a-74585b9c1f13-machine-approver-tls\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940787 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-config\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940861 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8931532-e4ee-4af7-b128-eddc57597a19-serving-cert\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940993 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941117 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptgfx\" (UniqueName: \"kubernetes.io/projected/05ea429b-cd6a-466f-a2ff-d469a1ed572c-kube-api-access-ptgfx\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940828 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h7w64"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941313 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941378 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-56pzb"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941436 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dg925"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941530 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h88d"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941612 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jz9dr"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941693 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941769 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kzs26"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.940901 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8jxgp"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.941919 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-config\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.943621 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.944093 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.945632 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.945871 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.946222 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a32b13-e38c-424a-8db2-92ea1032208b-serving-cert\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.947672 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.948748 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.949880 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.950980 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.952052 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.953148 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rc5z5"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.953682 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.954392 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.955435 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.956640 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gbqbf"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.957605 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gbqbf"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.959033 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.960143 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4p8pb"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.961076 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4p8pb"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.961939 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fml97"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.968785 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.970872 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kfbg"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.972231 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.975611 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6tvj9"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.984599 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.986763 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.988557 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.988750 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-49glq"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.989659 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gbqbf"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.991143 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.995375 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4p8pb"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.996739 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wzknm"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.997560 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wzknm"]
Sep 30 18:48:20 crc kubenswrapper[4747]: I0930 18:48:20.997611 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wzknm"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.009092 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.029354 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041736 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-trusted-ca\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041773 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98s2\" (UniqueName: \"kubernetes.io/projected/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-kube-api-access-n98s2\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041794 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-serving-cert\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041812 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-config\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041828 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/114885db-d01b-4a85-8b3a-74585b9c1f13-config\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041845 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kwk9\" (UniqueName: \"kubernetes.io/projected/0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4-kube-api-access-9kwk9\") pod \"downloads-7954f5f757-56pzb\" (UID: \"0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4\") " pod="openshift-console/downloads-7954f5f757-56pzb"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041866 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-trusted-ca-bundle\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041886 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041902 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/708b7382-ffc3-42e3-ac45-e1776b18473e-config\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041925 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50c8deee-cff0-4e24-ba8b-67116891e3ae-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041952 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f1b807e-a90f-45d0-9f65-04a2c1442c40-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041969 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3ae18e-3856-485b-9d77-f788989f86df-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.041987 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-service-ca\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042004 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-client-ca\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042022 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042037 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0971ea8b-120a-4f19-85d8-a1f349d91c8f-service-ca-bundle\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042054 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1eed6452-36c9-4768-aa14-dfccd421e67c-srv-cert\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042068 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj55t\" (UniqueName: \"kubernetes.io/projected/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-kube-api-access-wj55t\") pod \"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042086 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042113 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j754z\" (UniqueName: \"kubernetes.io/projected/708b7382-ffc3-42e3-ac45-e1776b18473e-kube-api-access-j754z\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042130 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s4bk\" (UniqueName: \"kubernetes.io/projected/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-kube-api-access-4s4bk\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042150 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/708b7382-ffc3-42e3-ac45-e1776b18473e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042166 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxbpc\" (UniqueName: \"kubernetes.io/projected/114885db-d01b-4a85-8b3a-74585b9c1f13-kube-api-access-cxbpc\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042183 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-config\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042200 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042216 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-etcd-client\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042232 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-etcd-serving-ca\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042255 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-console-config\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042270 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch87x\" (UniqueName: \"kubernetes.io/projected/0971ea8b-120a-4f19-85d8-a1f349d91c8f-kube-api-access-ch87x\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-config\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042306 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-audit\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042321 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/114885db-d01b-4a85-8b3a-74585b9c1f13-auth-proxy-config\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042336 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/114885db-d01b-4a85-8b3a-74585b9c1f13-machine-approver-tls\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042355 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f1b807e-a90f-45d0-9f65-04a2c1442c40-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042372 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042390 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-stats-auth\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042409 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8931532-e4ee-4af7-b128-eddc57597a19-serving-cert\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042425 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042442 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptgfx\" (UniqueName: \"kubernetes.io/projected/05ea429b-cd6a-466f-a2ff-d469a1ed572c-kube-api-access-ptgfx\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042599 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zfx\" (UniqueName: \"kubernetes.io/projected/273449d8-4695-48b9-835e-80756ba8cc1a-kube-api-access-b4zfx\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042654 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-metrics-tls\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042681 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cb16f716-b5a6-4885-8b9c-324a0f86c52a-srv-cert\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042699 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-cabundle\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042719 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273449d8-4695-48b9-835e-80756ba8cc1a-proxy-tls\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042740 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5dqq\" (UniqueName: \"kubernetes.io/projected/40c24418-02aa-4b76-aacc-4746107edc63-kube-api-access-s5dqq\") pod \"multus-admission-controller-857f4d67dd-rc5z5\" (UID: \"40c24418-02aa-4b76-aacc-4746107edc63\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042762 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042816 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-config\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042842 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2plxl\" (UniqueName: \"kubernetes.io/projected/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-kube-api-access-2plxl\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042865 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl4nq\" (UniqueName: \"kubernetes.io/projected/1eed6452-36c9-4768-aa14-dfccd421e67c-kube-api-access-fl4nq\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn"
Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042889 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-service-ca-bundle\") pod \"authentication-operator-69f744f599-dg925\" (UID: 
\"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042910 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6sts\" (UniqueName: \"kubernetes.io/projected/cb16f716-b5a6-4885-8b9c-324a0f86c52a-kube-api-access-z6sts\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042913 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-trusted-ca\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-serving-cert\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.042983 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3ae18e-3856-485b-9d77-f788989f86df-config\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043007 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043029 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-config\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26knk\" (UniqueName: \"kubernetes.io/projected/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-kube-api-access-26knk\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043077 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-client\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043111 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50c8deee-cff0-4e24-ba8b-67116891e3ae-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043133 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-config\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043156 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/708b7382-ffc3-42e3-ac45-e1776b18473e-images\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043179 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043205 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57dtv\" (UniqueName: \"kubernetes.io/projected/d8931532-e4ee-4af7-b128-eddc57597a19-kube-api-access-57dtv\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043211 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-config\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043232 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043257 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-image-import-ca\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043280 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043301 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-ca\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043336 
4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dae6e892-a5c9-493d-825c-49f6181a0f41-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gv745\" (UID: \"dae6e892-a5c9-493d-825c-49f6181a0f41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043357 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-trusted-ca\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043377 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-default-certificate\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043399 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043611 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-trusted-ca-bundle\") pod 
\"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043647 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/114885db-d01b-4a85-8b3a-74585b9c1f13-config\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043775 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-console-config\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.043848 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/114885db-d01b-4a85-8b3a-74585b9c1f13-auth-proxy-config\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-metrics-certs\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044076 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpbf4\" (UniqueName: 
\"kubernetes.io/projected/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-kube-api-access-vpbf4\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044105 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chr44\" (UniqueName: \"kubernetes.io/projected/5f90c236-a235-4782-8351-cad3bb90e3fa-kube-api-access-chr44\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044122 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-config-volume\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044140 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-key\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044166 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-service-ca\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044187 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-px8q2\" (UniqueName: \"kubernetes.io/projected/ac3ae18e-3856-485b-9d77-f788989f86df-kube-api-access-px8q2\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044201 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/708b7382-ffc3-42e3-ac45-e1776b18473e-config\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044214 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40c24418-02aa-4b76-aacc-4746107edc63-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rc5z5\" (UID: \"40c24418-02aa-4b76-aacc-4746107edc63\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.044331 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb16f716-b5a6-4885-8b9c-324a0f86c52a-profile-collector-cert\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.045042 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-config\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc 
kubenswrapper[4747]: I0930 18:48:21.045460 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.045797 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-client-ca\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.045795 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/708b7382-ffc3-42e3-ac45-e1776b18473e-images\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.046238 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.046475 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-service-ca\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " 
pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.046372 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-audit\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.046511 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-config\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.046914 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.047154 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-image-import-ca\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.047489 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-etcd-client\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.047549 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8931532-e4ee-4af7-b128-eddc57597a19-service-ca-bundle\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.047597 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f4g5\" (UniqueName: \"kubernetes.io/projected/dae6e892-a5c9-493d-825c-49f6181a0f41-kube-api-access-4f4g5\") pod \"cluster-samples-operator-665b6dd947-gv745\" (UID: \"dae6e892-a5c9-493d-825c-49f6181a0f41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.047883 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-serving-cert\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.047955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-serving-cert\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.048032 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.048134 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/273449d8-4695-48b9-835e-80756ba8cc1a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.048222 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-secret-volume\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.048316 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d4b405d-20dc-4706-93d8-0ce1197b654b-serving-cert\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.048368 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3ae18e-3856-485b-9d77-f788989f86df-config\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" Sep 30 
18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.048958 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/708b7382-ffc3-42e3-ac45-e1776b18473e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.049458 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8931532-e4ee-4af7-b128-eddc57597a19-serving-cert\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.049508 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-etcd-serving-ca\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050005 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4a1decac-0009-405b-9e01-8669eb06a74e-node-pullsecrets\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050066 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050066 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npljh\" (UniqueName: 
\"kubernetes.io/projected/4a1decac-0009-405b-9e01-8669eb06a74e-kube-api-access-npljh\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050128 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4a1decac-0009-405b-9e01-8669eb06a74e-node-pullsecrets\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050130 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f1b807e-a90f-45d0-9f65-04a2c1442c40-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050168 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3ae18e-3856-485b-9d77-f788989f86df-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050210 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050242 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-encryption-config\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050383 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a1decac-0009-405b-9e01-8669eb06a74e-audit-dir\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050427 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50c8deee-cff0-4e24-ba8b-67116891e3ae-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050432 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a1decac-0009-405b-9e01-8669eb06a74e-audit-dir\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050457 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzwqk\" (UniqueName: \"kubernetes.io/projected/0f1b807e-a90f-45d0-9f65-04a2c1442c40-kube-api-access-zzwqk\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050487 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-oauth-config\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050511 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1eed6452-36c9-4768-aa14-dfccd421e67c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.050535 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44mf8\" (UniqueName: \"kubernetes.io/projected/2d4b405d-20dc-4706-93d8-0ce1197b654b-kube-api-access-44mf8\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.051295 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a1decac-0009-405b-9e01-8669eb06a74e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.051467 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/dae6e892-a5c9-493d-825c-49f6181a0f41-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gv745\" (UID: \"dae6e892-a5c9-493d-825c-49f6181a0f41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.052059 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-serving-cert\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.052328 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/114885db-d01b-4a85-8b3a-74585b9c1f13-machine-approver-tls\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.053133 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-oauth-config\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.053287 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-serving-cert\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.053365 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-encryption-config\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.055475 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-serving-cert\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.069822 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.089345 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.108975 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.128886 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.149169 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151746 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f1b807e-a90f-45d0-9f65-04a2c1442c40-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151790 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151814 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-stats-auth\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151831 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-cabundle\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zfx\" (UniqueName: \"kubernetes.io/projected/273449d8-4695-48b9-835e-80756ba8cc1a-kube-api-access-b4zfx\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-metrics-tls\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cb16f716-b5a6-4885-8b9c-324a0f86c52a-srv-cert\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151915 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5dqq\" (UniqueName: \"kubernetes.io/projected/40c24418-02aa-4b76-aacc-4746107edc63-kube-api-access-s5dqq\") pod \"multus-admission-controller-857f4d67dd-rc5z5\" (UID: \"40c24418-02aa-4b76-aacc-4746107edc63\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151947 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151963 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273449d8-4695-48b9-835e-80756ba8cc1a-proxy-tls\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" Sep 30 18:48:21 crc kubenswrapper[4747]: 
I0930 18:48:21.151978 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-config\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.151999 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl4nq\" (UniqueName: \"kubernetes.io/projected/1eed6452-36c9-4768-aa14-dfccd421e67c-kube-api-access-fl4nq\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152016 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6sts\" (UniqueName: \"kubernetes.io/projected/cb16f716-b5a6-4885-8b9c-324a0f86c52a-kube-api-access-z6sts\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152040 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-client\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152057 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26knk\" (UniqueName: \"kubernetes.io/projected/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-kube-api-access-26knk\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152519 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50c8deee-cff0-4e24-ba8b-67116891e3ae-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152615 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-config\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152687 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152784 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-ca\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152838 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-default-certificate\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152894 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-trusted-ca\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.152981 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-metrics-certs\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153032 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpbf4\" (UniqueName: \"kubernetes.io/projected/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-kube-api-access-vpbf4\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153105 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153165 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-config-volume\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153216 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-config\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153228 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-key\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153305 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40c24418-02aa-4b76-aacc-4746107edc63-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rc5z5\" (UID: \"40c24418-02aa-4b76-aacc-4746107edc63\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153336 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb16f716-b5a6-4885-8b9c-324a0f86c52a-profile-collector-cert\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:21 crc kubenswrapper[4747]: 
I0930 18:48:21.153387 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153416 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/273449d8-4695-48b9-835e-80756ba8cc1a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153438 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-secret-volume\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153459 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d4b405d-20dc-4706-93d8-0ce1197b654b-serving-cert\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153496 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f1b807e-a90f-45d0-9f65-04a2c1442c40-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: 
\"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153522 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50c8deee-cff0-4e24-ba8b-67116891e3ae-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153542 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzwqk\" (UniqueName: \"kubernetes.io/projected/0f1b807e-a90f-45d0-9f65-04a2c1442c40-kube-api-access-zzwqk\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153560 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1eed6452-36c9-4768-aa14-dfccd421e67c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153577 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44mf8\" (UniqueName: \"kubernetes.io/projected/2d4b405d-20dc-4706-93d8-0ce1197b654b-kube-api-access-44mf8\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153614 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-n98s2\" (UniqueName: \"kubernetes.io/projected/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-kube-api-access-n98s2\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153641 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kwk9\" (UniqueName: \"kubernetes.io/projected/0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4-kube-api-access-9kwk9\") pod \"downloads-7954f5f757-56pzb\" (UID: \"0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4\") " pod="openshift-console/downloads-7954f5f757-56pzb" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153663 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50c8deee-cff0-4e24-ba8b-67116891e3ae-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153684 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f1b807e-a90f-45d0-9f65-04a2c1442c40-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153720 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-service-ca\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153747 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153763 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0971ea8b-120a-4f19-85d8-a1f349d91c8f-service-ca-bundle\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153781 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1eed6452-36c9-4768-aa14-dfccd421e67c-srv-cert\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153781 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-config\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153808 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj55t\" (UniqueName: \"kubernetes.io/projected/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-kube-api-access-wj55t\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153906 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-config\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153959 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.153999 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch87x\" (UniqueName: \"kubernetes.io/projected/0971ea8b-120a-4f19-85d8-a1f349d91c8f-kube-api-access-ch87x\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.154530 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f1b807e-a90f-45d0-9f65-04a2c1442c40-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 
18:48:21.154667 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-service-ca\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.154864 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/273449d8-4695-48b9-835e-80756ba8cc1a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.154994 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.155014 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-ca\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.157183 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d4b405d-20dc-4706-93d8-0ce1197b654b-etcd-client\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.158542 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.158737 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d4b405d-20dc-4706-93d8-0ce1197b654b-serving-cert\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.159844 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f1b807e-a90f-45d0-9f65-04a2c1442c40-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.169291 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.190380 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.202292 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.209042 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.229437 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.250214 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.259416 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40c24418-02aa-4b76-aacc-4746107edc63-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rc5z5\" (UID: \"40c24418-02aa-4b76-aacc-4746107edc63\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.269623 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.289713 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.308504 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.330227 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.350046 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.371010 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.390121 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.430145 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.437500 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb16f716-b5a6-4885-8b9c-324a0f86c52a-profile-collector-cert\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.440397 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1eed6452-36c9-4768-aa14-dfccd421e67c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.440801 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-secret-volume\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.459472 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.464168 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-trusted-ca\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.470355 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.480901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1eed6452-36c9-4768-aa14-dfccd421e67c-srv-cert\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.490909 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.510181 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.530187 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.549454 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.557775 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-metrics-tls\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.570127 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.590583 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.612076 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.629726 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.649303 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.670092 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.677114 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50c8deee-cff0-4e24-ba8b-67116891e3ae-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.690098 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.695918 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50c8deee-cff0-4e24-ba8b-67116891e3ae-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.709963 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.715304 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cb16f716-b5a6-4885-8b9c-324a0f86c52a-srv-cert\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.730130 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.749197 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.769185 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 30 
18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.789433 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.810171 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.816191 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-config-volume\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.829656 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.849031 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.869394 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.890099 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.908439 4747 request.go:700] Waited for 1.006663733s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 
18:48:21.910969 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.931031 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.949577 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.970136 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.978331 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.990351 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 18:48:21 crc kubenswrapper[4747]: I0930 18:48:21.994761 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-config\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.009397 4747 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.028828 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.043062 4747 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.043127 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-serving-cert podName:4a1decac-0009-405b-9e01-8669eb06a74e nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.543108536 +0000 UTC m=+142.202588640 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-serving-cert") pod "apiserver-76f77b778f-7jqj9" (UID: "4a1decac-0009-405b-9e01-8669eb06a74e") : failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.043393 4747 configmap.go:193] Couldn't get configMap openshift-console/oauth-serving-cert: failed to sync configmap cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.043453 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert podName:5f90c236-a235-4782-8351-cad3bb90e3fa nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.543444827 +0000 UTC m=+142.202924941 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "oauth-serving-cert" (UniqueName: "kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert") pod "console-f9d7485db-sn6w4" (UID: "5f90c236-a235-4782-8351-cad3bb90e3fa") : failed to sync configmap cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.045738 4747 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.045882 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert podName:05ea429b-cd6a-466f-a2ff-d469a1ed572c nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.545865017 +0000 UTC m=+142.205345141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert") pod "controller-manager-879f6c89f-nmpsg" (UID: "05ea429b-cd6a-466f-a2ff-d469a1ed572c") : failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.048377 4747 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.048549 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config podName:05ea429b-cd6a-466f-a2ff-d469a1ed572c nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.548535064 +0000 UTC m=+142.208015188 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config") pod "controller-manager-879f6c89f-nmpsg" (UID: "05ea429b-cd6a-466f-a2ff-d469a1ed572c") : failed to sync configmap cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.049824 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.069642 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.091412 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.120825 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.130213 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.149360 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.153261 4747 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.153520 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-cabundle podName:e882c7b3-ab9f-422d-acf9-da9234ecbc6a nodeName:}" failed. 
No retries permitted until 2025-09-30 18:48:22.653490109 +0000 UTC m=+142.312970253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-cabundle") pod "service-ca-9c57cc56f-49glq" (UID: "e882c7b3-ab9f-422d-acf9-da9234ecbc6a") : failed to sync configmap cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.153295 4747 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.153793 4747 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.153344 4747 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.153915 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-metrics-certs podName:0971ea8b-120a-4f19-85d8-a1f349d91c8f nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.653885262 +0000 UTC m=+142.313365416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-metrics-certs") pod "router-default-5444994796-6xh4b" (UID: "0971ea8b-120a-4f19-85d8-a1f349d91c8f") : failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.153385 4747 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.154011 4747 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.154063 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-stats-auth podName:0971ea8b-120a-4f19-85d8-a1f349d91c8f nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.654019386 +0000 UTC m=+142.313499580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-stats-auth") pod "router-default-5444994796-6xh4b" (UID: "0971ea8b-120a-4f19-85d8-a1f349d91c8f") : failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.154148 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-default-certificate podName:0971ea8b-120a-4f19-85d8-a1f349d91c8f nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.654126209 +0000 UTC m=+142.313606353 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-default-certificate") pod "router-default-5444994796-6xh4b" (UID: "0971ea8b-120a-4f19-85d8-a1f349d91c8f") : failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.154176 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-key podName:e882c7b3-ab9f-422d-acf9-da9234ecbc6a nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.654161561 +0000 UTC m=+142.313641715 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-key") pod "service-ca-9c57cc56f-49glq" (UID: "e882c7b3-ab9f-422d-acf9-da9234ecbc6a") : failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.154570 4747 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.154758 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/273449d8-4695-48b9-835e-80756ba8cc1a-proxy-tls podName:273449d8-4695-48b9-835e-80756ba8cc1a nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.654580924 +0000 UTC m=+142.314061108 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/273449d8-4695-48b9-835e-80756ba8cc1a-proxy-tls") pod "machine-config-controller-84d6567774-r8c9m" (UID: "273449d8-4695-48b9-835e-80756ba8cc1a") : failed to sync secret cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: E0930 18:48:22.154981 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0971ea8b-120a-4f19-85d8-a1f349d91c8f-service-ca-bundle podName:0971ea8b-120a-4f19-85d8-a1f349d91c8f nodeName:}" failed. No retries permitted until 2025-09-30 18:48:22.654918375 +0000 UTC m=+142.314398539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0971ea8b-120a-4f19-85d8-a1f349d91c8f-service-ca-bundle") pod "router-default-5444994796-6xh4b" (UID: "0971ea8b-120a-4f19-85d8-a1f349d91c8f") : failed to sync configmap cache: timed out waiting for the condition Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.169749 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.190493 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.209194 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.230437 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.249415 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.270263 4747 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.289512 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.309438 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.330262 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.350227 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.370666 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.390447 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.410115 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.429604 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.451116 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.470321 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.490029 4747 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.511054 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.530012 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.577272 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhb9\" (UniqueName: \"kubernetes.io/projected/00a32b13-e38c-424a-8db2-92ea1032208b-kube-api-access-lkhb9\") pod \"route-controller-manager-6576b87f9c-vcdjt\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.582253 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.582493 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.582727 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-serving-cert\") pod \"apiserver-76f77b778f-7jqj9\" 
(UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.582796 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.587778 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.590321 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.610688 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.630265 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.649755 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.670727 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.683486 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0971ea8b-120a-4f19-85d8-a1f349d91c8f-service-ca-bundle\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.686182 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-stats-auth\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.686306 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-cabundle\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.686347 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273449d8-4695-48b9-835e-80756ba8cc1a-proxy-tls\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.686524 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-default-certificate\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.686575 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-metrics-certs\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.686643 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-key\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.691759 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-cabundle\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.694745 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.695280 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/273449d8-4695-48b9-835e-80756ba8cc1a-proxy-tls\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.695872 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-signing-key\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.698355 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0971ea8b-120a-4f19-85d8-a1f349d91c8f-service-ca-bundle\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.706901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-stats-auth\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.707754 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-default-certificate\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.711183 4747 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.711759 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0971ea8b-120a-4f19-85d8-a1f349d91c8f-metrics-certs\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.730515 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.750413 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.773229 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.790303 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.807533 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"]
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.810610 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Sep 30 18:48:22 crc kubenswrapper[4747]: W0930 18:48:22.815543 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a32b13_e38c_424a_8db2_92ea1032208b.slice/crio-4c5fee074aecd396eff8d9edf8d1c6defa98e2d83a2a980517d203ea4f0ad11f WatchSource:0}: Error finding container 4c5fee074aecd396eff8d9edf8d1c6defa98e2d83a2a980517d203ea4f0ad11f: Status 404 returned error can't find the container with id 4c5fee074aecd396eff8d9edf8d1c6defa98e2d83a2a980517d203ea4f0ad11f
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.829718 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.870371 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j754z\" (UniqueName: \"kubernetes.io/projected/708b7382-ffc3-42e3-ac45-e1776b18473e-kube-api-access-j754z\") pod \"machine-api-operator-5694c8668f-7n9gf\" (UID: \"708b7382-ffc3-42e3-ac45-e1776b18473e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.885701 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s4bk\" (UniqueName: \"kubernetes.io/projected/3d3f0aa4-95db-4128-a2ec-1c59eb91c18e-kube-api-access-4s4bk\") pod \"openshift-config-operator-7777fb866f-74cqx\" (UID: \"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.905784 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" event={"ID":"00a32b13-e38c-424a-8db2-92ea1032208b","Type":"ContainerStarted","Data":"4c5fee074aecd396eff8d9edf8d1c6defa98e2d83a2a980517d203ea4f0ad11f"}
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.910682 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxbpc\" (UniqueName: \"kubernetes.io/projected/114885db-d01b-4a85-8b3a-74585b9c1f13-kube-api-access-cxbpc\") pod \"machine-approver-56656f9798-5h8pq\" (UID: \"114885db-d01b-4a85-8b3a-74585b9c1f13\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.927955 4747 request.go:700] Waited for 1.882683669s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.954006 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57dtv\" (UniqueName: \"kubernetes.io/projected/d8931532-e4ee-4af7-b128-eddc57597a19-kube-api-access-57dtv\") pod \"authentication-operator-69f744f599-dg925\" (UID: \"d8931532-e4ee-4af7-b128-eddc57597a19\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925"
Sep 30 18:48:22 crc kubenswrapper[4747]: I0930 18:48:22.971687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8q2\" (UniqueName: \"kubernetes.io/projected/ac3ae18e-3856-485b-9d77-f788989f86df-kube-api-access-px8q2\") pod \"openshift-apiserver-operator-796bbdcf4f-44z9c\" (UID: \"ac3ae18e-3856-485b-9d77-f788989f86df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.002992 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2plxl\" (UniqueName: \"kubernetes.io/projected/1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d-kube-api-access-2plxl\") pod \"console-operator-58897d9998-h7w64\" (UID: \"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d\") " pod="openshift-console-operator/console-operator-58897d9998-h7w64"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.032507 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.033824 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f4g5\" (UniqueName: \"kubernetes.io/projected/dae6e892-a5c9-493d-825c-49f6181a0f41-kube-api-access-4f4g5\") pod \"cluster-samples-operator-665b6dd947-gv745\" (UID: \"dae6e892-a5c9-493d-825c-49f6181a0f41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.045838 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npljh\" (UniqueName: \"kubernetes.io/projected/4a1decac-0009-405b-9e01-8669eb06a74e-kube-api-access-npljh\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.047474 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.057227 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.062480 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.067321 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e603a69d-3bfb-4cc5-bf21-b958b5f17b78-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2mxtk\" (UID: \"e603a69d-3bfb-4cc5-bf21-b958b5f17b78\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.072154 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.087280 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26knk\" (UniqueName: \"kubernetes.io/projected/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-kube-api-access-26knk\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm"
Sep 30 18:48:23 crc kubenswrapper[4747]: W0930 18:48:23.102620 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114885db_d01b_4a85_8b3a_74585b9c1f13.slice/crio-12a00a7310b7e528b765a4b3e26f71e6fca49319a8b8be66f392073b7833a50f WatchSource:0}: Error finding container 12a00a7310b7e528b765a4b3e26f71e6fca49319a8b8be66f392073b7833a50f: Status 404 returned error can't find the container with id 12a00a7310b7e528b765a4b3e26f71e6fca49319a8b8be66f392073b7833a50f
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.110962 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl4nq\" (UniqueName: \"kubernetes.io/projected/1eed6452-36c9-4768-aa14-dfccd421e67c-kube-api-access-fl4nq\") pod \"olm-operator-6b444d44fb-lkqvn\" (UID: \"1eed6452-36c9-4768-aa14-dfccd421e67c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.133138 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6sts\" (UniqueName: \"kubernetes.io/projected/cb16f716-b5a6-4885-8b9c-324a0f86c52a-kube-api-access-z6sts\") pod \"catalog-operator-68c6474976-wlwjd\" (UID: \"cb16f716-b5a6-4885-8b9c-324a0f86c52a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.136199 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.155539 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zfx\" (UniqueName: \"kubernetes.io/projected/273449d8-4695-48b9-835e-80756ba8cc1a-kube-api-access-b4zfx\") pod \"machine-config-controller-84d6567774-r8c9m\" (UID: \"273449d8-4695-48b9-835e-80756ba8cc1a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.169269 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97bb37a5-2fa5-4580-8ba1-0ce90e3584cf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8cbk9\" (UID: \"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.185427 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5dqq\" (UniqueName: \"kubernetes.io/projected/40c24418-02aa-4b76-aacc-4746107edc63-kube-api-access-s5dqq\") pod \"multus-admission-controller-857f4d67dd-rc5z5\" (UID: \"40c24418-02aa-4b76-aacc-4746107edc63\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.218800 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.218841 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.239012 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.239392 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce1b7795-da9d-4275-b9c5-f0dfe3cc9917-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z45qm\" (UID: \"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.254467 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/50c8deee-cff0-4e24-ba8b-67116891e3ae-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wzrrm\" (UID: \"50c8deee-cff0-4e24-ba8b-67116891e3ae\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.259619 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h7w64"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.260489 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpbf4\" (UniqueName: \"kubernetes.io/projected/e882c7b3-ab9f-422d-acf9-da9234ecbc6a-kube-api-access-vpbf4\") pod \"service-ca-9c57cc56f-49glq\" (UID: \"e882c7b3-ab9f-422d-acf9-da9234ecbc6a\") " pod="openshift-service-ca/service-ca-9c57cc56f-49glq"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.277134 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.281881 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44mf8\" (UniqueName: \"kubernetes.io/projected/2d4b405d-20dc-4706-93d8-0ce1197b654b-kube-api-access-44mf8\") pod \"etcd-operator-b45778765-6tvj9\" (UID: \"2d4b405d-20dc-4706-93d8-0ce1197b654b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.283861 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj55t\" (UniqueName: \"kubernetes.io/projected/f9ff24dd-26ea-4c4a-8c7b-171092fb7666-kube-api-access-wj55t\") pod \"openshift-controller-manager-operator-756b6f6bc6-4vzsx\" (UID: \"f9ff24dd-26ea-4c4a-8c7b-171092fb7666\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.302325 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.305384 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98s2\" (UniqueName: \"kubernetes.io/projected/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-kube-api-access-n98s2\") pod \"collect-profiles-29320965-5lgtg\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.312967 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.313228 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-49glq"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.324996 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzwqk\" (UniqueName: \"kubernetes.io/projected/0f1b807e-a90f-45d0-9f65-04a2c1442c40-kube-api-access-zzwqk\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.349145 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-74cqx"]
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.363772 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f1b807e-a90f-45d0-9f65-04a2c1442c40-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pbzp4\" (UID: \"0f1b807e-a90f-45d0-9f65-04a2c1442c40\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.389315 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch87x\" (UniqueName: \"kubernetes.io/projected/0971ea8b-120a-4f19-85d8-a1f349d91c8f-kube-api-access-ch87x\") pod \"router-default-5444994796-6xh4b\" (UID: \"0971ea8b-120a-4f19-85d8-a1f349d91c8f\") " pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.400802 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.413559 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.426137 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.426358 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.431582 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.433583 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.441846 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1decac-0009-405b-9e01-8669eb06a74e-serving-cert\") pod \"apiserver-76f77b778f-7jqj9\" (UID: \"4a1decac-0009-405b-9e01-8669eb06a74e\") " pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.448814 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.461644 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.474148 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.486665 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chr44\" (UniqueName: \"kubernetes.io/projected/5f90c236-a235-4782-8351-cad3bb90e3fa-kube-api-access-chr44\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.486863 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kwk9\" (UniqueName: \"kubernetes.io/projected/0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4-kube-api-access-9kwk9\") pod \"downloads-7954f5f757-56pzb\" (UID: \"0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4\") " pod="openshift-console/downloads-7954f5f757-56pzb"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.489241 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.500619 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert\") pod \"console-f9d7485db-sn6w4\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.501182 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.511561 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.525232 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.536358 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptgfx\" (UniqueName: \"kubernetes.io/projected/05ea429b-cd6a-466f-a2ff-d469a1ed572c-kube-api-access-ptgfx\") pod \"controller-manager-879f6c89f-nmpsg\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.547868 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.579978 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.583775 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7n9gf"]
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.586569 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c"]
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.615098 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sn6w4"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.618179 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6xh4b"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625224 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff7gv\" (UniqueName: \"kubernetes.io/projected/d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2-kube-api-access-ff7gv\") pod \"package-server-manager-789f6589d5-x6t8n\" (UID: \"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625264 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-policies\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625287 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-encryption-config\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625303 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5f92557-8211-484e-b0f6-c7cd21998ca6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625317 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f92557-8211-484e-b0f6-c7cd21998ca6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625334 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5fwm\" (UniqueName: \"kubernetes.io/projected/d526e7bd-199d-4d9f-8826-6eee8fc0fa8d-kube-api-access-q5fwm\") pod \"control-plane-machine-set-operator-78cbb6b69f-sq5x4\" (UID: \"d526e7bd-199d-4d9f-8826-6eee8fc0fa8d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625351 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v8f2\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-kube-api-access-6v8f2\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625389 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsp57\" (UniqueName: \"kubernetes.io/projected/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-kube-api-access-nsp57\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625419 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07103b35-ea08-4d06-b981-d04736a21d17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625437 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmgn\" (UniqueName: \"kubernetes.io/projected/d1185a87-4827-4668-a07c-fb227c9c4213-kube-api-access-8cmgn\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625455 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625476 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-registry-tls\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625490 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e65d8d96-ac46-4fa8-b940-55290af981d5-proxy-tls\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625514 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x6t8n\" (UID: \"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625532 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625547 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwdk\" (UniqueName: \"kubernetes.io/projected/e65d8d96-ac46-4fa8-b940-55290af981d5-kube-api-access-5jwdk\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625562 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-dir\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625576 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/288d90b3-91c8-4768-8920-939d1e515807-audit-dir\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625593 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e65d8d96-ac46-4fa8-b940-55290af981d5-images\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625607 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625623 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625639 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b57ac9-657f-4e92-a71c-a99b52c3c79c-config\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625660 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625678 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk9wd\" (UniqueName: \"kubernetes.io/projected/66d8c362-4cc5-48b4-898d-1ab38933bcd7-kube-api-access-rk9wd\") pod \"migrator-59844c95c7-z4ch7\" (UID: \"66d8c362-4cc5-48b4-898d-1ab38933bcd7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7"
Sep 30 18:48:23 crc kubenswrapper[4747]: I0930
18:48:23.625696 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhzx\" (UniqueName: \"kubernetes.io/projected/68b57ac9-657f-4e92-a71c-a99b52c3c79c-kube-api-access-jdhzx\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625711 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vd4c\" (UniqueName: \"kubernetes.io/projected/d5f92557-8211-484e-b0f6-c7cd21998ca6-kube-api-access-5vd4c\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625727 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625741 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625762 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625776 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r7z7\" (UniqueName: \"kubernetes.io/projected/288d90b3-91c8-4768-8920-939d1e515807-kube-api-access-9r7z7\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625792 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07103b35-ea08-4d06-b981-d04736a21d17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625808 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1185a87-4827-4668-a07c-fb227c9c4213-apiservice-cert\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625823 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-bound-sa-token\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc 
kubenswrapper[4747]: I0930 18:48:23.625836 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625852 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625866 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-audit-policies\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625880 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625901 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625943 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d526e7bd-199d-4d9f-8826-6eee8fc0fa8d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sq5x4\" (UID: \"d526e7bd-199d-4d9f-8826-6eee8fc0fa8d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625963 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6hd\" (UniqueName: \"kubernetes.io/projected/0007394a-7089-407b-ad0f-25c9794ccefe-kube-api-access-2q6hd\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.625981 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7c7\" (UniqueName: \"kubernetes.io/projected/0a4e1fbe-d463-46af-8e64-223fd290f89c-kube-api-access-gg7c7\") pod \"dns-operator-744455d44c-jz9dr\" (UID: \"0a4e1fbe-d463-46af-8e64-223fd290f89c\") " pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.626001 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1185a87-4827-4668-a07c-fb227c9c4213-webhook-cert\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: 
\"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.626017 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.626034 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.626111 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d1185a87-4827-4668-a07c-fb227c9c4213-tmpfs\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.626508 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68b57ac9-657f-4e92-a71c-a99b52c3c79c-serving-cert\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.626546 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-etcd-client\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.626564 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-serving-cert\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: E0930 18:48:23.626838 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.126822556 +0000 UTC m=+143.786302670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.627099 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-registry-certificates\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.627120 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a4e1fbe-d463-46af-8e64-223fd290f89c-metrics-tls\") pod \"dns-operator-744455d44c-jz9dr\" (UID: \"0a4e1fbe-d463-46af-8e64-223fd290f89c\") " pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.627159 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.627175 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-trusted-ca\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.627192 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e65d8d96-ac46-4fa8-b940-55290af981d5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.627253 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.664476 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dg925"] Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.683694 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk"] Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.690309 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-56pzb" Sep 30 18:48:23 crc kubenswrapper[4747]: W0930 18:48:23.696959 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3ae18e_3856_485b_9d77_f788989f86df.slice/crio-c214908b76c0d4c0cab374310a3100dad39a2bc90e0c7fee0efe0b7e19e30298 WatchSource:0}: Error finding container c214908b76c0d4c0cab374310a3100dad39a2bc90e0c7fee0efe0b7e19e30298: Status 404 returned error can't find the container with id c214908b76c0d4c0cab374310a3100dad39a2bc90e0c7fee0efe0b7e19e30298 Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.720903 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn"] Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.728976 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729218 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5fwm\" (UniqueName: \"kubernetes.io/projected/d526e7bd-199d-4d9f-8826-6eee8fc0fa8d-kube-api-access-q5fwm\") pod \"control-plane-machine-set-operator-78cbb6b69f-sq5x4\" (UID: \"d526e7bd-199d-4d9f-8826-6eee8fc0fa8d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729271 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52297ca2-f082-428f-bcb3-aaf6d37e354f-metrics-tls\") pod \"dns-default-gbqbf\" (UID: 
\"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729297 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v8f2\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-kube-api-access-6v8f2\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: E0930 18:48:23.729387 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.229364011 +0000 UTC m=+143.888844125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729510 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsp57\" (UniqueName: \"kubernetes.io/projected/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-kube-api-access-nsp57\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729536 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/07103b35-ea08-4d06-b981-d04736a21d17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729783 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-csi-data-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmgn\" (UniqueName: \"kubernetes.io/projected/d1185a87-4827-4668-a07c-fb227c9c4213-kube-api-access-8cmgn\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.729954 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-registry-tls\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 
18:48:23.729979 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e65d8d96-ac46-4fa8-b940-55290af981d5-proxy-tls\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730000 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-mountpoint-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730047 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x6t8n\" (UID: \"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730085 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730102 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwdk\" (UniqueName: \"kubernetes.io/projected/e65d8d96-ac46-4fa8-b940-55290af981d5-kube-api-access-5jwdk\") pod 
\"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730118 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-dir\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730145 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/288d90b3-91c8-4768-8920-939d1e515807-audit-dir\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730182 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e65d8d96-ac46-4fa8-b940-55290af981d5-images\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730218 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ptfc\" (UniqueName: \"kubernetes.io/projected/f6ff4b11-2578-4861-99ad-355202e9f320-kube-api-access-4ptfc\") pod \"ingress-canary-wzknm\" (UID: \"f6ff4b11-2578-4861-99ad-355202e9f320\") " pod="openshift-ingress-canary/ingress-canary-wzknm" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730243 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730258 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730295 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b57ac9-657f-4e92-a71c-a99b52c3c79c-config\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730318 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730337 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk9wd\" (UniqueName: \"kubernetes.io/projected/66d8c362-4cc5-48b4-898d-1ab38933bcd7-kube-api-access-rk9wd\") pod \"migrator-59844c95c7-z4ch7\" (UID: \"66d8c362-4cc5-48b4-898d-1ab38933bcd7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7" Sep 30 18:48:23 crc 
kubenswrapper[4747]: I0930 18:48:23.730431 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-registration-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730486 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhzx\" (UniqueName: \"kubernetes.io/projected/68b57ac9-657f-4e92-a71c-a99b52c3c79c-kube-api-access-jdhzx\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730520 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vd4c\" (UniqueName: \"kubernetes.io/projected/d5f92557-8211-484e-b0f6-c7cd21998ca6-kube-api-access-5vd4c\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730543 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730571 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730606 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730646 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r7z7\" (UniqueName: \"kubernetes.io/projected/288d90b3-91c8-4768-8920-939d1e515807-kube-api-access-9r7z7\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730700 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6ff4b11-2578-4861-99ad-355202e9f320-cert\") pod \"ingress-canary-wzknm\" (UID: \"f6ff4b11-2578-4861-99ad-355202e9f320\") " pod="openshift-ingress-canary/ingress-canary-wzknm" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.730762 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07103b35-ea08-4d06-b981-d04736a21d17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.731156 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.731431 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07103b35-ea08-4d06-b981-d04736a21d17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.732035 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.732440 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/288d90b3-91c8-4768-8920-939d1e515807-audit-dir\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.732581 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-dir\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.732683 4747 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.733568 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1185a87-4827-4668-a07c-fb227c9c4213-apiservice-cert\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.733917 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-bound-sa-token\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.733978 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.734467 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b57ac9-657f-4e92-a71c-a99b52c3c79c-config\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 
18:48:23.734559 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x6t8n\" (UID: \"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.734835 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.735035 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.735074 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-plugins-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.735314 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-audit-policies\") pod \"apiserver-7bbb656c7d-478bd\" (UID: 
\"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.735347 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.735376 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbkdm\" (UniqueName: \"kubernetes.io/projected/3475dad0-af15-4cbe-b43c-640fcebd0873-kube-api-access-hbkdm\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.735460 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: E0930 18:48:23.735889 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.235873814 +0000 UTC m=+143.895353928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.736084 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d526e7bd-199d-4d9f-8826-6eee8fc0fa8d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sq5x4\" (UID: \"d526e7bd-199d-4d9f-8826-6eee8fc0fa8d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.736149 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7c7\" (UniqueName: \"kubernetes.io/projected/0a4e1fbe-d463-46af-8e64-223fd290f89c-kube-api-access-gg7c7\") pod \"dns-operator-744455d44c-jz9dr\" (UID: \"0a4e1fbe-d463-46af-8e64-223fd290f89c\") " pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.736847 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e65d8d96-ac46-4fa8-b940-55290af981d5-proxy-tls\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.737211 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.737438 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.737620 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6hd\" (UniqueName: \"kubernetes.io/projected/0007394a-7089-407b-ad0f-25c9794ccefe-kube-api-access-2q6hd\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.737738 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.737850 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1185a87-4827-4668-a07c-fb227c9c4213-webhook-cert\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 
18:48:23.737884 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.737962 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.738095 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3774af40-0437-4166-abb6-f283ac1e97e5-node-bootstrap-token\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.738203 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.738256 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d1185a87-4827-4668-a07c-fb227c9c4213-tmpfs\") pod \"packageserver-d55dfcdfc-2p5vp\" 
(UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.738640 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d1185a87-4827-4668-a07c-fb227c9c4213-tmpfs\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.739675 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-socket-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.739713 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68b57ac9-657f-4e92-a71c-a99b52c3c79c-serving-cert\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.739742 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-229pq\" (UniqueName: \"kubernetes.io/projected/52297ca2-f082-428f-bcb3-aaf6d37e354f-kube-api-access-229pq\") pod \"dns-default-gbqbf\" (UID: \"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.739851 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-etcd-client\") pod 
\"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.739875 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1185a87-4827-4668-a07c-fb227c9c4213-apiservice-cert\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.740870 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.743268 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1185a87-4827-4668-a07c-fb227c9c4213-webhook-cert\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.743341 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d526e7bd-199d-4d9f-8826-6eee8fc0fa8d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sq5x4\" (UID: \"d526e7bd-199d-4d9f-8826-6eee8fc0fa8d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.743481 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.744189 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-serving-cert\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.744526 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/288d90b3-91c8-4768-8920-939d1e515807-audit-policies\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.744545 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52297ca2-f082-428f-bcb3-aaf6d37e354f-config-volume\") pod \"dns-default-gbqbf\" (UID: \"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.744807 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 
18:48:23.745016 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-registry-certificates\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.745051 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a4e1fbe-d463-46af-8e64-223fd290f89c-metrics-tls\") pod \"dns-operator-744455d44c-jz9dr\" (UID: \"0a4e1fbe-d463-46af-8e64-223fd290f89c\") " pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.745383 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-trusted-ca\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.749540 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-trusted-ca\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.754549 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-registry-certificates\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 
crc kubenswrapper[4747]: I0930 18:48:23.755214 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68b57ac9-657f-4e92-a71c-a99b52c3c79c-serving-cert\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.755378 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.755517 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rc5z5"] Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.755783 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a4e1fbe-d463-46af-8e64-223fd290f89c-metrics-tls\") pod \"dns-operator-744455d44c-jz9dr\" (UID: \"0a4e1fbe-d463-46af-8e64-223fd290f89c\") " pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.756152 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.756159 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.756293 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.756335 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e65d8d96-ac46-4fa8-b940-55290af981d5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.756817 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-registry-tls\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.757003 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mftk7\" (UniqueName: \"kubernetes.io/projected/3774af40-0437-4166-abb6-f283ac1e97e5-kube-api-access-mftk7\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " 
pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.757040 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.757133 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3774af40-0437-4166-abb6-f283ac1e97e5-certs\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.757166 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07103b35-ea08-4d06-b981-d04736a21d17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.757202 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff7gv\" (UniqueName: \"kubernetes.io/projected/d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2-kube-api-access-ff7gv\") pod \"package-server-manager-789f6589d5-x6t8n\" (UID: \"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.757223 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-policies\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.757766 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-policies\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.760072 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.760074 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-encryption-config\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.760733 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5f92557-8211-484e-b0f6-c7cd21998ca6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 
18:48:23.760788 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f92557-8211-484e-b0f6-c7cd21998ca6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.761406 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f92557-8211-484e-b0f6-c7cd21998ca6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.766750 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5f92557-8211-484e-b0f6-c7cd21998ca6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.777164 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v8f2\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-kube-api-access-6v8f2\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.783607 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5fwm\" (UniqueName: 
\"kubernetes.io/projected/d526e7bd-199d-4d9f-8826-6eee8fc0fa8d-kube-api-access-q5fwm\") pod \"control-plane-machine-set-operator-78cbb6b69f-sq5x4\" (UID: \"d526e7bd-199d-4d9f-8826-6eee8fc0fa8d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.803261 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.821278 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.827202 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e65d8d96-ac46-4fa8-b940-55290af981d5-images\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.830341 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e65d8d96-ac46-4fa8-b940-55290af981d5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.837457 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r7z7\" (UniqueName: \"kubernetes.io/projected/288d90b3-91c8-4768-8920-939d1e515807-kube-api-access-9r7z7\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc 
kubenswrapper[4747]: I0930 18:48:23.844504 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-serving-cert\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.844561 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-etcd-client\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.844814 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/288d90b3-91c8-4768-8920-939d1e515807-encryption-config\") pod \"apiserver-7bbb656c7d-478bd\" (UID: \"288d90b3-91c8-4768-8920-939d1e515807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.856365 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsp57\" (UniqueName: \"kubernetes.io/projected/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-kube-api-access-nsp57\") pod \"oauth-openshift-558db77b4-kzs26\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.861918 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:23 crc kubenswrapper[4747]: 
I0930 18:48:23.862190 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6ff4b11-2578-4861-99ad-355202e9f320-cert\") pod \"ingress-canary-wzknm\" (UID: \"f6ff4b11-2578-4861-99ad-355202e9f320\") " pod="openshift-ingress-canary/ingress-canary-wzknm" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862245 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-plugins-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862269 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbkdm\" (UniqueName: \"kubernetes.io/projected/3475dad0-af15-4cbe-b43c-640fcebd0873-kube-api-access-hbkdm\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862319 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3774af40-0437-4166-abb6-f283ac1e97e5-node-bootstrap-token\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862340 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-socket-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862357 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-229pq\" (UniqueName: \"kubernetes.io/projected/52297ca2-f082-428f-bcb3-aaf6d37e354f-kube-api-access-229pq\") pod \"dns-default-gbqbf\" (UID: \"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862382 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52297ca2-f082-428f-bcb3-aaf6d37e354f-config-volume\") pod \"dns-default-gbqbf\" (UID: \"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862402 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mftk7\" (UniqueName: \"kubernetes.io/projected/3774af40-0437-4166-abb6-f283ac1e97e5-kube-api-access-mftk7\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862419 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3774af40-0437-4166-abb6-f283ac1e97e5-certs\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862444 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52297ca2-f082-428f-bcb3-aaf6d37e354f-metrics-tls\") pod \"dns-default-gbqbf\" (UID: \"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862471 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-csi-data-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862510 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-mountpoint-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862541 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ptfc\" (UniqueName: \"kubernetes.io/projected/f6ff4b11-2578-4861-99ad-355202e9f320-kube-api-access-4ptfc\") pod \"ingress-canary-wzknm\" (UID: \"f6ff4b11-2578-4861-99ad-355202e9f320\") " pod="openshift-ingress-canary/ingress-canary-wzknm" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862576 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-registration-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.862848 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-registration-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: E0930 18:48:23.862920 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.362904623 +0000 UTC m=+144.022384737 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.864245 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52297ca2-f082-428f-bcb3-aaf6d37e354f-config-volume\") pod \"dns-default-gbqbf\" (UID: \"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.864324 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-socket-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.864468 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-mountpoint-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.864528 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-csi-data-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.864556 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3475dad0-af15-4cbe-b43c-640fcebd0873-plugins-dir\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.868151 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3774af40-0437-4166-abb6-f283ac1e97e5-node-bootstrap-token\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.869466 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3774af40-0437-4166-abb6-f283ac1e97e5-certs\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.870040 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52297ca2-f082-428f-bcb3-aaf6d37e354f-metrics-tls\") pod \"dns-default-gbqbf\" (UID: \"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.870849 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6ff4b11-2578-4861-99ad-355202e9f320-cert\") pod 
\"ingress-canary-wzknm\" (UID: \"f6ff4b11-2578-4861-99ad-355202e9f320\") " pod="openshift-ingress-canary/ingress-canary-wzknm" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.874030 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cmgn\" (UniqueName: \"kubernetes.io/projected/d1185a87-4827-4668-a07c-fb227c9c4213-kube-api-access-8cmgn\") pod \"packageserver-d55dfcdfc-2p5vp\" (UID: \"d1185a87-4827-4668-a07c-fb227c9c4213\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.883624 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhzx\" (UniqueName: \"kubernetes.io/projected/68b57ac9-657f-4e92-a71c-a99b52c3c79c-kube-api-access-jdhzx\") pod \"service-ca-operator-777779d784-tf7xk\" (UID: \"68b57ac9-657f-4e92-a71c-a99b52c3c79c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.903547 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vd4c\" (UniqueName: \"kubernetes.io/projected/d5f92557-8211-484e-b0f6-c7cd21998ca6-kube-api-access-5vd4c\") pod \"kube-storage-version-migrator-operator-b67b599dd-mf2xg\" (UID: \"d5f92557-8211-484e-b0f6-c7cd21998ca6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.911265 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" event={"ID":"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e","Type":"ContainerStarted","Data":"951c714599f9df8f002683c2d6b86daca2f24182bfe4986c0702c9d1b756b747"} Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.911353 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" event={"ID":"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e","Type":"ContainerStarted","Data":"b3798a01497b01af94250b0da4f8f2761e6129f47147fd98068179340d490c9c"} Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.924437 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.925676 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk9wd\" (UniqueName: \"kubernetes.io/projected/66d8c362-4cc5-48b4-898d-1ab38933bcd7-kube-api-access-rk9wd\") pod \"migrator-59844c95c7-z4ch7\" (UID: \"66d8c362-4cc5-48b4-898d-1ab38933bcd7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.947982 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6xh4b" event={"ID":"0971ea8b-120a-4f19-85d8-a1f349d91c8f","Type":"ContainerStarted","Data":"d61489f217567337f61e88c6383b28e7c868a6795a234707a526f47839ef4bc9"} Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.957395 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwdk\" (UniqueName: \"kubernetes.io/projected/e65d8d96-ac46-4fa8-b940-55290af981d5-kube-api-access-5jwdk\") pod \"machine-config-operator-74547568cd-fml97\" (UID: \"e65d8d96-ac46-4fa8-b940-55290af981d5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.957704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" event={"ID":"1eed6452-36c9-4768-aa14-dfccd421e67c","Type":"ContainerStarted","Data":"e14b669df60a3bb52d02bface6f6adc9e29338044d3f7c9af1f682faf92c93e6"} Sep 30 18:48:23 crc 
kubenswrapper[4747]: I0930 18:48:23.964067 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: E0930 18:48:23.964543 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.464521577 +0000 UTC m=+144.124001691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.969369 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-49glq"] Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.972631 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-bound-sa-token\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.972855 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd"] Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.975783 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" event={"ID":"d8931532-e4ee-4af7-b128-eddc57597a19","Type":"ContainerStarted","Data":"5ca8268c7b9de1057df551163c36490da221e416cbe109556d2e21ccbefe75d2"} Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.979571 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.982262 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" event={"ID":"e603a69d-3bfb-4cc5-bf21-b958b5f17b78","Type":"ContainerStarted","Data":"7808e64f044f3fbcd022f846ed9b55319bf96b7c38c6fb339eb8ce7ad7d74aff"} Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.983208 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq" event={"ID":"114885db-d01b-4a85-8b3a-74585b9c1f13","Type":"ContainerStarted","Data":"12a00a7310b7e528b765a4b3e26f71e6fca49319a8b8be66f392073b7833a50f"} Sep 30 18:48:23 crc kubenswrapper[4747]: I0930 18:48:23.990804 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" event={"ID":"40c24418-02aa-4b76-aacc-4746107edc63","Type":"ContainerStarted","Data":"d7a69ab61f99f713f622d028c277aa265c6e33140efd573719a18b16cf1a5f2a"} Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.000633 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.008038 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-h7w64"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.015637 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" event={"ID":"708b7382-ffc3-42e3-ac45-e1776b18473e","Type":"ContainerStarted","Data":"7cf2fb1e602fd8ce1b53c7c225da58caad78e631f04228f38c082877c51a8a68"} Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.016322 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.016630 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7c7\" (UniqueName: \"kubernetes.io/projected/0a4e1fbe-d463-46af-8e64-223fd290f89c-kube-api-access-gg7c7\") pod \"dns-operator-744455d44c-jz9dr\" (UID: \"0a4e1fbe-d463-46af-8e64-223fd290f89c\") " pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.021524 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" event={"ID":"00a32b13-e38c-424a-8db2-92ea1032208b","Type":"ContainerStarted","Data":"79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315"} Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.021981 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.024714 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff7gv\" (UniqueName: \"kubernetes.io/projected/d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2-kube-api-access-ff7gv\") pod \"package-server-manager-789f6589d5-x6t8n\" (UID: \"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.024767 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.026221 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" event={"ID":"ac3ae18e-3856-485b-9d77-f788989f86df","Type":"ContainerStarted","Data":"c214908b76c0d4c0cab374310a3100dad39a2bc90e0c7fee0efe0b7e19e30298"} Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.035884 4747 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-vcdjt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.035938 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" podUID="00a32b13-e38c-424a-8db2-92ea1032208b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.038432 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6hd\" (UniqueName: \"kubernetes.io/projected/0007394a-7089-407b-ad0f-25c9794ccefe-kube-api-access-2q6hd\") pod \"marketplace-operator-79b997595-8kfbg\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.046203 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.066329 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.072059 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.074280 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.074323 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m"] Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.074399 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.574377932 +0000 UTC m=+144.233858046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.079801 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mftk7\" (UniqueName: \"kubernetes.io/projected/3774af40-0437-4166-abb6-f283ac1e97e5-kube-api-access-mftk7\") pod \"machine-config-server-8jxgp\" (UID: \"3774af40-0437-4166-abb6-f283ac1e97e5\") " pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.081492 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.086640 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-229pq\" (UniqueName: \"kubernetes.io/projected/52297ca2-f082-428f-bcb3-aaf6d37e354f-kube-api-access-229pq\") pod \"dns-default-gbqbf\" (UID: \"52297ca2-f082-428f-bcb3-aaf6d37e354f\") " pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.109903 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.129876 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbkdm\" (UniqueName: \"kubernetes.io/projected/3475dad0-af15-4cbe-b43c-640fcebd0873-kube-api-access-hbkdm\") pod \"csi-hostpathplugin-4p8pb\" (UID: \"3475dad0-af15-4cbe-b43c-640fcebd0873\") " pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.134649 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ptfc\" (UniqueName: \"kubernetes.io/projected/f6ff4b11-2578-4861-99ad-355202e9f320-kube-api-access-4ptfc\") pod \"ingress-canary-wzknm\" (UID: \"f6ff4b11-2578-4861-99ad-355202e9f320\") " pod="openshift-ingress-canary/ingress-canary-wzknm" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.139672 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.154245 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.158827 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6tvj9"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.173702 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.174073 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.674060643 +0000 UTC m=+144.333540757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.187251 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.192185 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.234452 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8jxgp" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.241956 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.271242 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.274078 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.274463 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.774448358 +0000 UTC m=+144.433928472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.274972 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wzknm" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.375477 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.376149 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.876136384 +0000 UTC m=+144.535616488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.428850 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sn6w4"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.476731 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.477064 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:24.977049256 +0000 UTC m=+144.636529370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.487461 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.487513 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.527819 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg"] Sep 30 18:48:24 crc kubenswrapper[4747]: W0930 18:48:24.573585 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f90c236_a235_4782_8351_cad3bb90e3fa.slice/crio-41a27d36da8c39dcb547d5eed494bc54d3c97dbf24a3af90d8d4a9235d4fda25 WatchSource:0}: Error finding container 41a27d36da8c39dcb547d5eed494bc54d3c97dbf24a3af90d8d4a9235d4fda25: Status 404 returned error can't find the container with id 41a27d36da8c39dcb547d5eed494bc54d3c97dbf24a3af90d8d4a9235d4fda25 Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.585315 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.585702 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.085687341 +0000 UTC m=+144.745167455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.590120 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-56pzb"] Sep 30 18:48:24 crc kubenswrapper[4747]: W0930 18:48:24.596698 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50c8deee_cff0_4e24_ba8b_67116891e3ae.slice/crio-4cce2e9bb10bb205b6528833d7cb3e52e27c71ad5b57c334468e4a95a3ec9637 WatchSource:0}: Error finding container 4cce2e9bb10bb205b6528833d7cb3e52e27c71ad5b57c334468e4a95a3ec9637: Status 404 returned error can't find the container with id 4cce2e9bb10bb205b6528833d7cb3e52e27c71ad5b57c334468e4a95a3ec9637 Sep 30 18:48:24 crc kubenswrapper[4747]: W0930 18:48:24.641715 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b5cf81a_9d0e_4f3f_8596_5c1e17f87431.slice/crio-726cc1926c1df328c4721b3ca9480e07217c849d6878ce4d11442919760d5641 WatchSource:0}: Error finding container 
726cc1926c1df328c4721b3ca9480e07217c849d6878ce4d11442919760d5641: Status 404 returned error can't find the container with id 726cc1926c1df328c4721b3ca9480e07217c849d6878ce4d11442919760d5641 Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.657497 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7jqj9"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.686612 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.686998 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.186952164 +0000 UTC m=+144.846432278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.788824 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" podStartSLOduration=122.788802456 podStartE2EDuration="2m2.788802456s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:24.788267519 +0000 UTC m=+144.447747633" watchObservedRunningTime="2025-09-30 18:48:24.788802456 +0000 UTC m=+144.448282570" Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.789648 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.790041 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.290028106 +0000 UTC m=+144.949508220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.872080 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fml97"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.876045 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.880781 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.892091 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.892782 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.392755487 +0000 UTC m=+145.052235631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.900248 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.914613 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kzs26"] Sep 30 18:48:24 crc kubenswrapper[4747]: I0930 18:48:24.994833 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:24 crc kubenswrapper[4747]: E0930 18:48:24.995536 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.495523119 +0000 UTC m=+145.155003233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.082077 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" event={"ID":"708b7382-ffc3-42e3-ac45-e1776b18473e","Type":"ContainerStarted","Data":"86fcda526a616fd7453e78dc313a4900206cbe1c8cc5fc2f21422469091da193"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.096145 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:25 crc kubenswrapper[4747]: E0930 18:48:25.100441 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.6004021 +0000 UTC m=+145.259882214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.109786 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" event={"ID":"40c24418-02aa-4b76-aacc-4746107edc63","Type":"ContainerStarted","Data":"732b454d55d7c26d102a9618928210c374818b2093b873fe6079343ed28b4079"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.119994 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" event={"ID":"e65d8d96-ac46-4fa8-b940-55290af981d5","Type":"ContainerStarted","Data":"c1080903dddad9bc46f8099027f7c148e7ce83f794668947d22a32f605eba3c5"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.127393 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" event={"ID":"e603a69d-3bfb-4cc5-bf21-b958b5f17b78","Type":"ContainerStarted","Data":"1fa81eab45008e9923d3766aa7ed010877884b0bc9388c98e2415d2a40175d65"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.143061 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jz9dr"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.149100 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" 
event={"ID":"f9ff24dd-26ea-4c4a-8c7b-171092fb7666","Type":"ContainerStarted","Data":"d5d94e999fcb1394628785a4e291ddaf10056a840c1cd8c6a9ed9e3f216bf242"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.191667 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.198348 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:25 crc kubenswrapper[4747]: E0930 18:48:25.199572 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.699542334 +0000 UTC m=+145.359022608 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.202286 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h7w64" event={"ID":"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d","Type":"ContainerStarted","Data":"aacd60c3b9f0af6e243b77b2a00b8166bd4df3fa52454be85f0e4ac852f446a1"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.216992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sn6w4" event={"ID":"5f90c236-a235-4782-8351-cad3bb90e3fa","Type":"ContainerStarted","Data":"41a27d36da8c39dcb547d5eed494bc54d3c97dbf24a3af90d8d4a9235d4fda25"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.224718 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" event={"ID":"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917","Type":"ContainerStarted","Data":"74e23cb7573787751dece9acfae1fd393a20438930241d7eded852d61c3e5bb5"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.227925 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" event={"ID":"d8931532-e4ee-4af7-b128-eddc57597a19","Type":"ContainerStarted","Data":"51b86cc2daea7f36966380bb730fa97ce5d452d8e317a4480c629b4038c2264e"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.232845 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kfbg"] Sep 30 18:48:25 
crc kubenswrapper[4747]: I0930 18:48:25.241277 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq" event={"ID":"114885db-d01b-4a85-8b3a-74585b9c1f13","Type":"ContainerStarted","Data":"0c0dea520af2a84f4368179d55ec785739ebee43fc1969bd27e3f733278837bf"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.241328 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq" event={"ID":"114885db-d01b-4a85-8b3a-74585b9c1f13","Type":"ContainerStarted","Data":"113a9a6618d42acecced43644b2f9218ce0da27cbb458ed8ae0dc8143b5db8cb"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.241780 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.244759 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-49glq" event={"ID":"e882c7b3-ab9f-422d-acf9-da9234ecbc6a","Type":"ContainerStarted","Data":"672537693e9e53f3951ab26ca0f3200a189fd3c5ff80e57fa49199a8c6df89bf"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.244859 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-49glq" event={"ID":"e882c7b3-ab9f-422d-acf9-da9234ecbc6a","Type":"ContainerStarted","Data":"bd7e1a3507b31972f8e8325a7f95ccebd8fdbdc107fce3aa55a4781174e0b8ac"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.246694 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" event={"ID":"1eed6452-36c9-4768-aa14-dfccd421e67c","Type":"ContainerStarted","Data":"96786438d8ed0f12c62d4ef85b3e8e57a5160b12352ff2124d7bfb25b4e2deda"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.248300 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.250626 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.257109 4747 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lkqvn container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.257159 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" podUID="1eed6452-36c9-4768-aa14-dfccd421e67c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.257590 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" event={"ID":"273449d8-4695-48b9-835e-80756ba8cc1a","Type":"ContainerStarted","Data":"160feee8f581871bfe46b9bd2a551b626e21905375b907accfeb63b3950d4473"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.259263 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" event={"ID":"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2","Type":"ContainerStarted","Data":"23d873a24e5c08ea6712b4a0b216b70b1227a1be653bf2e38701159f15b4b1f9"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.262254 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" 
event={"ID":"e53e2da5-c4a6-42ae-a59b-a064f1c8756b","Type":"ContainerStarted","Data":"def72629eb1b2f26a0165f54a8c1c31aed4e2fc4eefcdc36b3a52b6809cd72d1"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.267479 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmpsg"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.287208 4747 generic.go:334] "Generic (PLEG): container finished" podID="3d3f0aa4-95db-4128-a2ec-1c59eb91c18e" containerID="951c714599f9df8f002683c2d6b86daca2f24182bfe4986c0702c9d1b756b747" exitCode=0 Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.287303 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" event={"ID":"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e","Type":"ContainerDied","Data":"951c714599f9df8f002683c2d6b86daca2f24182bfe4986c0702c9d1b756b747"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.297294 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" event={"ID":"0f1b807e-a90f-45d0-9f65-04a2c1442c40","Type":"ContainerStarted","Data":"8bdbb20e8ae2af7377c896381a2fac4031019ae247c128b14209933bc4643d63"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.300477 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.301676 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" 
event={"ID":"2d4b405d-20dc-4706-93d8-0ce1197b654b","Type":"ContainerStarted","Data":"8e628e11f322144c7d15b7e5aea47baee4c9766b0af9d0466bfc1e9d29f619c0"} Sep 30 18:48:25 crc kubenswrapper[4747]: E0930 18:48:25.302170 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.80214143 +0000 UTC m=+145.461621544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.303075 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-56pzb" event={"ID":"0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4","Type":"ContainerStarted","Data":"6ea8f0a5f503bbd35656e998ce8c4150655478f42e882dd1cf84f24292db5882"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.304839 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" event={"ID":"4a1decac-0009-405b-9e01-8669eb06a74e","Type":"ContainerStarted","Data":"4dafacf0011ba98dba68aa2f4d829042eed826868a0c6cdcbb6637fae0a0e247"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.312335 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6xh4b" event={"ID":"0971ea8b-120a-4f19-85d8-a1f349d91c8f","Type":"ContainerStarted","Data":"29b25611eed16612d9bc2b73539f66dbdf61d5f7feb55c27a8d6199f831e652a"} Sep 30 18:48:25 crc 
kubenswrapper[4747]: I0930 18:48:25.321750 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8jxgp" event={"ID":"3774af40-0437-4166-abb6-f283ac1e97e5","Type":"ContainerStarted","Data":"53a8ab6c81701476865f0d80f8d4ec656b4270c81e3687b94ddc862bcab0b9ba"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.342087 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" event={"ID":"ac3ae18e-3856-485b-9d77-f788989f86df","Type":"ContainerStarted","Data":"403c22ff4861c05ea8b73b1447ee17064a947effd1615cf7639e9ef655d9a60c"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.346210 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" event={"ID":"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf","Type":"ContainerStarted","Data":"cffddcbccd2f54ee69df78f5e6b2e3f0b347deb8ab1a59162928cda131523398"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.348704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" event={"ID":"d1185a87-4827-4668-a07c-fb227c9c4213","Type":"ContainerStarted","Data":"379d8aa0343ce6de9528a32ba0ee5b2d578dedb6afabbb8bc50feea05c638e66"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.352434 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" event={"ID":"50c8deee-cff0-4e24-ba8b-67116891e3ae","Type":"ContainerStarted","Data":"4cce2e9bb10bb205b6528833d7cb3e52e27c71ad5b57c334468e4a95a3ec9637"} Sep 30 18:48:25 crc kubenswrapper[4747]: W0930 18:48:25.374823 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0007394a_7089_407b_ad0f_25c9794ccefe.slice/crio-a6cf40e76bc6ff640438c172691b7cc4c3980fe848b9ac721732e01b2ed2dc27 WatchSource:0}: Error finding container a6cf40e76bc6ff640438c172691b7cc4c3980fe848b9ac721732e01b2ed2dc27: Status 404 returned error can't find the container with id a6cf40e76bc6ff640438c172691b7cc4c3980fe848b9ac721732e01b2ed2dc27 Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.387311 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" event={"ID":"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431","Type":"ContainerStarted","Data":"726cc1926c1df328c4721b3ca9480e07217c849d6878ce4d11442919760d5641"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.396443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" event={"ID":"dae6e892-a5c9-493d-825c-49f6181a0f41","Type":"ContainerStarted","Data":"39467e270320e0d59f788bf8bba61fc40327c394189bfc89da04fc5256a1e7b6"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.402202 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:25 crc kubenswrapper[4747]: E0930 18:48:25.405326 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:25.905310856 +0000 UTC m=+145.564790970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.413174 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" event={"ID":"cb16f716-b5a6-4885-8b9c-324a0f86c52a","Type":"ContainerStarted","Data":"c073f692f475ae69cb28d5a78f8e9cf7da70f446226c0fbd1e9a29fc7ad75711"} Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.413219 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.421574 4747 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wlwjd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.421635 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" podUID="cb16f716-b5a6-4885-8b9c-324a0f86c52a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.422526 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.435937 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4p8pb"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.463036 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wzknm"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.475130 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.503042 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gbqbf"] Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.503316 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:25 crc kubenswrapper[4747]: E0930 18:48:25.507075 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.007050124 +0000 UTC m=+145.666530238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: W0930 18:48:25.527003 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3475dad0_af15_4cbe_b43c_640fcebd0873.slice/crio-ed35600900c1fd833b08e2a6183f4f9b1b2035d64288b864d0752c4c5a6f4f5f WatchSource:0}: Error finding container ed35600900c1fd833b08e2a6183f4f9b1b2035d64288b864d0752c4c5a6f4f5f: Status 404 returned error can't find the container with id ed35600900c1fd833b08e2a6183f4f9b1b2035d64288b864d0752c4c5a6f4f5f Sep 30 18:48:25 crc kubenswrapper[4747]: W0930 18:48:25.531495 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66d8c362_4cc5_48b4_898d_1ab38933bcd7.slice/crio-56b71bea856e8b8ec5e5789f849f8e133b945ad3cab6655ac0208fcc7d68a826 WatchSource:0}: Error finding container 56b71bea856e8b8ec5e5789f849f8e133b945ad3cab6655ac0208fcc7d68a826: Status 404 returned error can't find the container with id 56b71bea856e8b8ec5e5789f849f8e133b945ad3cab6655ac0208fcc7d68a826 Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.606459 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:25 
crc kubenswrapper[4747]: E0930 18:48:25.606861 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.106848029 +0000 UTC m=+145.766328143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: W0930 18:48:25.625202 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52297ca2_f082_428f_bcb3_aaf6d37e354f.slice/crio-79b6b9630327d2cffa0d10ca38e70c913a4d6e0712079e4d132aadd56a9616b5 WatchSource:0}: Error finding container 79b6b9630327d2cffa0d10ca38e70c913a4d6e0712079e4d132aadd56a9616b5: Status 404 returned error can't find the container with id 79b6b9630327d2cffa0d10ca38e70c913a4d6e0712079e4d132aadd56a9616b5 Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.625584 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.639193 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:25 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:25 crc kubenswrapper[4747]: [+]process-running ok Sep 30 
18:48:25 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.639255 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.656188 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" podStartSLOduration=123.656161027 podStartE2EDuration="2m3.656161027s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:25.64985646 +0000 UTC m=+145.309336574" watchObservedRunningTime="2025-09-30 18:48:25.656161027 +0000 UTC m=+145.315641141" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.685601 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44z9c" podStartSLOduration=123.685582353 podStartE2EDuration="2m3.685582353s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:25.685360836 +0000 UTC m=+145.344840940" watchObservedRunningTime="2025-09-30 18:48:25.685582353 +0000 UTC m=+145.345062467" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.714257 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:25 crc 
kubenswrapper[4747]: E0930 18:48:25.714555 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.214538893 +0000 UTC m=+145.874019007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.764886 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6xh4b" podStartSLOduration=123.764860184 podStartE2EDuration="2m3.764860184s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:25.764389699 +0000 UTC m=+145.423869823" watchObservedRunningTime="2025-09-30 18:48:25.764860184 +0000 UTC m=+145.424340298" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.816849 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:25 crc kubenswrapper[4747]: E0930 18:48:25.817608 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.317387528 +0000 UTC m=+145.976867642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.879260 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" podStartSLOduration=123.879240258 podStartE2EDuration="2m3.879240258s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:25.849865334 +0000 UTC m=+145.509345448" watchObservedRunningTime="2025-09-30 18:48:25.879240258 +0000 UTC m=+145.538720372" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.880459 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2mxtk" podStartSLOduration=123.880452017 podStartE2EDuration="2m3.880452017s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:25.879350681 +0000 UTC m=+145.538830805" watchObservedRunningTime="2025-09-30 18:48:25.880452017 +0000 UTC m=+145.539932121" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.918587 
4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:25 crc kubenswrapper[4747]: E0930 18:48:25.919600 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.419578471 +0000 UTC m=+146.079058585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.934594 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8jxgp" podStartSLOduration=5.934570263 podStartE2EDuration="5.934570263s" podCreationTimestamp="2025-09-30 18:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:25.933600902 +0000 UTC m=+145.593081016" watchObservedRunningTime="2025-09-30 18:48:25.934570263 +0000 UTC m=+145.594050377" Sep 30 18:48:25 crc kubenswrapper[4747]: I0930 18:48:25.960743 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" 
podStartSLOduration=123.960728642 podStartE2EDuration="2m3.960728642s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:25.959489031 +0000 UTC m=+145.618969145" watchObservedRunningTime="2025-09-30 18:48:25.960728642 +0000 UTC m=+145.620208756" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.005486 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-49glq" podStartSLOduration=124.00546756 podStartE2EDuration="2m4.00546756s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.00365104 +0000 UTC m=+145.663131154" watchObservedRunningTime="2025-09-30 18:48:26.00546756 +0000 UTC m=+145.664947674" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.021092 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.021419 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.521405843 +0000 UTC m=+146.180885947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.043395 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dg925" podStartSLOduration=124.043373794 podStartE2EDuration="2m4.043373794s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.040123307 +0000 UTC m=+145.699603421" watchObservedRunningTime="2025-09-30 18:48:26.043373794 +0000 UTC m=+145.702853908" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.121887 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.122283 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.622269173 +0000 UTC m=+146.281749277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.127733 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5h8pq" podStartSLOduration=125.12768614 podStartE2EDuration="2m5.12768614s" podCreationTimestamp="2025-09-30 18:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.084976109 +0000 UTC m=+145.744456223" watchObservedRunningTime="2025-09-30 18:48:26.12768614 +0000 UTC m=+145.787166254" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.128474 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" podStartSLOduration=124.128468116 podStartE2EDuration="2m4.128468116s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.127435732 +0000 UTC m=+145.786915856" watchObservedRunningTime="2025-09-30 18:48:26.128468116 +0000 UTC m=+145.787948220" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.223126 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: 
\"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.223510 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.723496234 +0000 UTC m=+146.382976348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.324486 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.325997 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.825978707 +0000 UTC m=+146.485458821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.434475 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.434825 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:26.934812429 +0000 UTC m=+146.594292543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.444870 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" event={"ID":"05ea429b-cd6a-466f-a2ff-d469a1ed572c","Type":"ContainerStarted","Data":"69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.444941 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" event={"ID":"05ea429b-cd6a-466f-a2ff-d469a1ed572c","Type":"ContainerStarted","Data":"538502bdb9208238c6326e90eee785b69c78f045d6bee25d107b8addfbaaeb3b"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.445862 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.449459 4747 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nmpsg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.449511 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" podUID="05ea429b-cd6a-466f-a2ff-d469a1ed572c" containerName="controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.464105 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" event={"ID":"40c24418-02aa-4b76-aacc-4746107edc63","Type":"ContainerStarted","Data":"0fbb9372a2781bf729b4c8bca6ba9d709b1d214677f2dc6319b139b968e6eadc"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.483405 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" event={"ID":"3d3f0aa4-95db-4128-a2ec-1c59eb91c18e","Type":"ContainerStarted","Data":"1f4caff42f7df5cbd337839a5530c267ba059ea6ccfbc072cca6efdcccb59a6d"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.483483 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.501112 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" podStartSLOduration=124.501092214 podStartE2EDuration="2m4.501092214s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.466135877 +0000 UTC m=+146.125615991" watchObservedRunningTime="2025-09-30 18:48:26.501092214 +0000 UTC m=+146.160572328" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.501209 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rc5z5" podStartSLOduration=124.501205777 podStartE2EDuration="2m4.501205777s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.500728612 +0000 UTC m=+146.160208726" watchObservedRunningTime="2025-09-30 18:48:26.501205777 +0000 UTC m=+146.160685891" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.546003 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.548368 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.048347054 +0000 UTC m=+146.707827168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.559073 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" podStartSLOduration=124.559053536 podStartE2EDuration="2m4.559053536s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.558385064 +0000 UTC m=+146.217865178" watchObservedRunningTime="2025-09-30 18:48:26.559053536 +0000 UTC m=+146.218533650" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.564215 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" event={"ID":"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917","Type":"ContainerStarted","Data":"4a0396381d1579c813d3f155679f2ce02208e7abd853ff6dcb6409186fc7b211"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.575828 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4vzsx" event={"ID":"f9ff24dd-26ea-4c4a-8c7b-171092fb7666","Type":"ContainerStarted","Data":"6a71c3b8ffc7ecc742b73f8fbb26b3d019ea0fe35b534d007a704bdbcd58733f"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.590260 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" 
event={"ID":"d5f92557-8211-484e-b0f6-c7cd21998ca6","Type":"ContainerStarted","Data":"e01aecf7fcf7aa1c1f34d262d80279f4d03168b1398c0d9636a21d2bada6ce57"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.590301 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" event={"ID":"d5f92557-8211-484e-b0f6-c7cd21998ca6","Type":"ContainerStarted","Data":"11f2e1622d1955b1e3975d0c79693e44b8fd021f373317e7c6eb83405b7a257b"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.592963 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pbzp4" event={"ID":"0f1b807e-a90f-45d0-9f65-04a2c1442c40","Type":"ContainerStarted","Data":"81dc00b21588e73b380af30546b3a5ba14c73ad401e7d521bf7292770910c8a5"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.594105 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wzknm" event={"ID":"f6ff4b11-2578-4861-99ad-355202e9f320","Type":"ContainerStarted","Data":"2d14dd1c5c376c2c52a33c9e915ae0c47ac0255ffb67e620cda6c46010b74f2f"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.594129 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wzknm" event={"ID":"f6ff4b11-2578-4861-99ad-355202e9f320","Type":"ContainerStarted","Data":"0c710f81f0370b9f72f3912c20fc8e35a08b95aa1bdbcb507c290475e8af2be9"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.620726 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-56pzb" event={"ID":"0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4","Type":"ContainerStarted","Data":"6fdc38bd91c5ec57c7e488e1a9648dc8b2dff979a68012ea5b462b9d1724368f"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.621476 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-56pzb" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.623140 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-56pzb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.623241 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56pzb" podUID="0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.630044 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:26 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:26 crc kubenswrapper[4747]: [+]process-running ok Sep 30 18:48:26 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.630103 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.639393 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mf2xg" podStartSLOduration=124.639378172 podStartE2EDuration="2m4.639378172s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.614202396 +0000 UTC m=+146.273682510" watchObservedRunningTime="2025-09-30 18:48:26.639378172 +0000 UTC m=+146.298858286" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.639481 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wzknm" podStartSLOduration=6.639476865 podStartE2EDuration="6.639476865s" podCreationTimestamp="2025-09-30 18:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.637061096 +0000 UTC m=+146.296541210" watchObservedRunningTime="2025-09-30 18:48:26.639476865 +0000 UTC m=+146.298956979" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.643327 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" event={"ID":"97bb37a5-2fa5-4580-8ba1-0ce90e3584cf","Type":"ContainerStarted","Data":"baedfc5dd4d0c8ba956762d8cba9d02e9a0e9a209677d9ae5132f27e59f76bd2"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.647579 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.649104 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.149090511 +0000 UTC m=+146.808570625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.670394 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" event={"ID":"288d90b3-91c8-4768-8920-939d1e515807","Type":"ContainerStarted","Data":"10e663ce81693763ca11a2e111819e4615ee181ea69c583d136fbc7ff79eafaf"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.695484 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" event={"ID":"3475dad0-af15-4cbe-b43c-640fcebd0873","Type":"ContainerStarted","Data":"ed35600900c1fd833b08e2a6183f4f9b1b2035d64288b864d0752c4c5a6f4f5f"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.701258 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-56pzb" podStartSLOduration=124.701244272 podStartE2EDuration="2m4.701244272s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.70057609 +0000 UTC m=+146.360056214" watchObservedRunningTime="2025-09-30 18:48:26.701244272 +0000 UTC m=+146.360724386" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.711382 4747 generic.go:334] "Generic (PLEG): container finished" podID="4a1decac-0009-405b-9e01-8669eb06a74e" containerID="e4e1bde7407bb6af9a94298e7e94b6c4e30200b00f70aac6748931a14c814a09" exitCode=0 Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 
18:48:26.711771 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" event={"ID":"4a1decac-0009-405b-9e01-8669eb06a74e","Type":"ContainerDied","Data":"e4e1bde7407bb6af9a94298e7e94b6c4e30200b00f70aac6748931a14c814a09"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.735381 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" event={"ID":"d1185a87-4827-4668-a07c-fb227c9c4213","Type":"ContainerStarted","Data":"960d94283c8b59e8d099b0edbc8edfd874cf58e7fcdcd7768cef4a11e6d5c706"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.736598 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.737706 4747 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-2p5vp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" start-of-body= Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.737748 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" podUID="d1185a87-4827-4668-a07c-fb227c9c4213" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.748606 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:26 crc 
kubenswrapper[4747]: E0930 18:48:26.750118 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.250103245 +0000 UTC m=+146.909583359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.789761 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8cbk9" podStartSLOduration=124.789746796 podStartE2EDuration="2m4.789746796s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.789394335 +0000 UTC m=+146.448874449" watchObservedRunningTime="2025-09-30 18:48:26.789746796 +0000 UTC m=+146.449226910" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.806490 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8jxgp" event={"ID":"3774af40-0437-4166-abb6-f283ac1e97e5","Type":"ContainerStarted","Data":"7bd14724e787ff2c59481e92f8e6d77607db097528d777e13f682afc6bf5194e"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.809186 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" 
event={"ID":"0007394a-7089-407b-ad0f-25c9794ccefe","Type":"ContainerStarted","Data":"31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.809231 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" event={"ID":"0007394a-7089-407b-ad0f-25c9794ccefe","Type":"ContainerStarted","Data":"a6cf40e76bc6ff640438c172691b7cc4c3980fe848b9ac721732e01b2ed2dc27"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.810031 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.812688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" event={"ID":"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2","Type":"ContainerStarted","Data":"2b95a2732885db83de2dec75dac9369a465a8b0bd2411bd356dea977a7568f7f"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.812713 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" event={"ID":"d4e1a2e7-6d0e-4ffd-9b17-a2dd316adfc2","Type":"ContainerStarted","Data":"f86359045b6a53731fe220de3ef15f7e448bd9f96b33ae459c70d833d056436a"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.813132 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.813198 4747 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8kfbg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Sep 30 18:48:26 crc 
kubenswrapper[4747]: I0930 18:48:26.813224 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" podUID="0007394a-7089-407b-ad0f-25c9794ccefe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.851997 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.853661 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.353647663 +0000 UTC m=+147.013127777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.854255 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" event={"ID":"dae6e892-a5c9-493d-825c-49f6181a0f41","Type":"ContainerStarted","Data":"978c434e76da81fc9b95cd95562225e4fdd316cf2d9aec4aa8ea7b69a21510c7"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.866993 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" event={"ID":"2d4b405d-20dc-4706-93d8-0ce1197b654b","Type":"ContainerStarted","Data":"ec901152180b6c605e6225af0d34f2e5716e71e4672fc9b77d242143a9ba9af5"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.893301 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gbqbf" event={"ID":"52297ca2-f082-428f-bcb3-aaf6d37e354f","Type":"ContainerStarted","Data":"79b6b9630327d2cffa0d10ca38e70c913a4d6e0712079e4d132aadd56a9616b5"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.907243 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" event={"ID":"d526e7bd-199d-4d9f-8826-6eee8fc0fa8d","Type":"ContainerStarted","Data":"56ab97a9d69bccda83103c4d9195d8340b6b835a93e73cfadefc0fddf12b5511"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.907295 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" 
event={"ID":"d526e7bd-199d-4d9f-8826-6eee8fc0fa8d","Type":"ContainerStarted","Data":"f49be580940c76cf725f61f3589e73d8d94f3de50ea55e112c3ef34ba83bbddb"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.921875 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" event={"ID":"68b57ac9-657f-4e92-a71c-a99b52c3c79c","Type":"ContainerStarted","Data":"ed5cdae32908e1301ddbcf621acd325b2a90701a1dfdf67852810a4d4a4e1917"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.921923 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" event={"ID":"68b57ac9-657f-4e92-a71c-a99b52c3c79c","Type":"ContainerStarted","Data":"8c9b7d8963982f0eaee4204f2e618d7ebe10ea8e5378942ca40d46a920d65a99"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.948373 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" podStartSLOduration=124.94834671 podStartE2EDuration="2m4.94834671s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.904556954 +0000 UTC m=+146.564037068" watchObservedRunningTime="2025-09-30 18:48:26.94834671 +0000 UTC m=+146.607826814" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.954765 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:26 crc kubenswrapper[4747]: E0930 18:48:26.956288 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.456272381 +0000 UTC m=+147.115752495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.958223 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" event={"ID":"e53e2da5-c4a6-42ae-a59b-a064f1c8756b","Type":"ContainerStarted","Data":"8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.959039 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.948983 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" podStartSLOduration=124.948977471 podStartE2EDuration="2m4.948977471s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:26.943415299 +0000 UTC m=+146.602895423" watchObservedRunningTime="2025-09-30 18:48:26.948977471 +0000 UTC m=+146.608457585" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.967740 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" event={"ID":"708b7382-ffc3-42e3-ac45-e1776b18473e","Type":"ContainerStarted","Data":"150d16a620580ccdceff26c434e10210e6ff64b9561f0ef57b4990037e41df9a"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.969024 4747 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kzs26 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" start-of-body= Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.969063 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" podUID="e53e2da5-c4a6-42ae-a59b-a064f1c8756b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.981062 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" event={"ID":"0a4e1fbe-d463-46af-8e64-223fd290f89c","Type":"ContainerStarted","Data":"a4853473e5b30da330bede7ee5b86d4d858b9cffce2ccc3011501772bae50108"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.982513 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" event={"ID":"cb16f716-b5a6-4885-8b9c-324a0f86c52a","Type":"ContainerStarted","Data":"bf5e29fe450582723daedc875fe4358d771e62914d638bd0dd94944539328a59"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.983570 4747 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wlwjd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= 
Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.983610 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" podUID="cb16f716-b5a6-4885-8b9c-324a0f86c52a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.985152 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" event={"ID":"273449d8-4695-48b9-835e-80756ba8cc1a","Type":"ContainerStarted","Data":"ea7763e17fefda9a5a3a02e29c2081eb8fd3c8bf47282c878ee89f339638233b"} Sep 30 18:48:26 crc kubenswrapper[4747]: I0930 18:48:26.985182 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" event={"ID":"273449d8-4695-48b9-835e-80756ba8cc1a","Type":"ContainerStarted","Data":"8fa6a126ec6ca349a5f9b16eec22ff24d070a944eec657fb0062949e702b1dbf"} Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.003351 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" event={"ID":"50c8deee-cff0-4e24-ba8b-67116891e3ae","Type":"ContainerStarted","Data":"665a5284f0d261b538d028ce4baff7741762ead9e6ffd6bab8edd532dc04db74"} Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.005883 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-6tvj9" podStartSLOduration=125.005859118 podStartE2EDuration="2m5.005859118s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.000656497 +0000 UTC m=+146.660136621" 
watchObservedRunningTime="2025-09-30 18:48:27.005859118 +0000 UTC m=+146.665339232" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.023700 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h7w64" event={"ID":"1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d","Type":"ContainerStarted","Data":"8c1a389e61a9c62823f74802c0b28a46367b332e16309646b57bf64ff2c63912"} Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.040028 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.034102 4747 patch_prober.go:28] interesting pod/console-operator-58897d9998-h7w64 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.040115 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h7w64" podUID="1a6b71ed-56c5-4fd9-b918-fc6c3d668a7d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.031938 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" podStartSLOduration=125.031908123 podStartE2EDuration="2m5.031908123s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.031294803 +0000 UTC m=+146.690774917" watchObservedRunningTime="2025-09-30 18:48:27.031908123 +0000 UTC m=+146.691388237" Sep 30 18:48:27 crc 
kubenswrapper[4747]: I0930 18:48:27.057234 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sn6w4" event={"ID":"5f90c236-a235-4782-8351-cad3bb90e3fa","Type":"ContainerStarted","Data":"f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4"} Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.058475 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.070258 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7" event={"ID":"66d8c362-4cc5-48b4-898d-1ab38933bcd7","Type":"ContainerStarted","Data":"6fb237a219af7cc5f7e9877761253019a248226a80295e0d5e90f2389e295cb6"} Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.070313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7" event={"ID":"66d8c362-4cc5-48b4-898d-1ab38933bcd7","Type":"ContainerStarted","Data":"56b71bea856e8b8ec5e5789f849f8e133b945ad3cab6655ac0208fcc7d68a826"} Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.087829 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" podStartSLOduration=125.087808547 podStartE2EDuration="2m5.087808547s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.067622835 +0000 UTC m=+146.727102949" watchObservedRunningTime="2025-09-30 
18:48:27.087808547 +0000 UTC m=+146.747288661" Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.116734 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.616696855 +0000 UTC m=+147.276176969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.134434 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wzrrm" podStartSLOduration=125.134417517 podStartE2EDuration="2m5.134417517s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.120299954 +0000 UTC m=+146.779780068" watchObservedRunningTime="2025-09-30 18:48:27.134417517 +0000 UTC m=+146.793897621" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.168882 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.170812 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.670794711 +0000 UTC m=+147.330274825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.188317 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" event={"ID":"e65d8d96-ac46-4fa8-b940-55290af981d5","Type":"ContainerStarted","Data":"3ab71eef82b9ecd0066827fd44b15985369b6c2ac132894b7e031f96173080f2"} Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.193082 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" event={"ID":"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431","Type":"ContainerStarted","Data":"99a40801fd332ee6c2834af04f43342f0d34b48265ee3bc75add57c395955901"} Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.197867 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-sn6w4" podStartSLOduration=125.197854778 podStartE2EDuration="2m5.197854778s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.196040999 +0000 UTC m=+146.855521113" watchObservedRunningTime="2025-09-30 18:48:27.197854778 +0000 UTC 
m=+146.857334892" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.198524 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sq5x4" podStartSLOduration=125.19851958 podStartE2EDuration="2m5.19851958s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.171610827 +0000 UTC m=+146.831090951" watchObservedRunningTime="2025-09-30 18:48:27.19851958 +0000 UTC m=+146.857999684" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.222552 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lkqvn" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.260130 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-h7w64" podStartSLOduration=125.260109081 podStartE2EDuration="2m5.260109081s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.222917031 +0000 UTC m=+146.882397145" watchObservedRunningTime="2025-09-30 18:48:27.260109081 +0000 UTC m=+146.919589195" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.260908 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" podStartSLOduration=125.260904997 podStartE2EDuration="2m5.260904997s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.255548872 +0000 UTC m=+146.915028986" watchObservedRunningTime="2025-09-30 18:48:27.260904997 +0000 UTC 
m=+146.920385101" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.275144 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.278799 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.778781894 +0000 UTC m=+147.438262008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.286366 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tf7xk" podStartSLOduration=125.286350662 podStartE2EDuration="2m5.286350662s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.284676807 +0000 UTC m=+146.944156921" watchObservedRunningTime="2025-09-30 18:48:27.286350662 +0000 UTC m=+146.945830776" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.315881 4747 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-r8c9m" podStartSLOduration=125.315859051 podStartE2EDuration="2m5.315859051s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.314785335 +0000 UTC m=+146.974265449" watchObservedRunningTime="2025-09-30 18:48:27.315859051 +0000 UTC m=+146.975339165" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.339464 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7n9gf" podStartSLOduration=125.339447495 podStartE2EDuration="2m5.339447495s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.338878346 +0000 UTC m=+146.998358460" watchObservedRunningTime="2025-09-30 18:48:27.339447495 +0000 UTC m=+146.998927609" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.368458 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" podStartSLOduration=125.368439836 podStartE2EDuration="2m5.368439836s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.367223076 +0000 UTC m=+147.026703190" watchObservedRunningTime="2025-09-30 18:48:27.368439836 +0000 UTC m=+147.027919950" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.376678 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.377124 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.87711043 +0000 UTC m=+147.536590544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.416843 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" podStartSLOduration=126.416824614 podStartE2EDuration="2m6.416824614s" podCreationTimestamp="2025-09-30 18:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.416278896 +0000 UTC m=+147.075759000" watchObservedRunningTime="2025-09-30 18:48:27.416824614 +0000 UTC m=+147.076304728" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.485257 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: 
\"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.485581 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:27.98556984 +0000 UTC m=+147.645049954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.586608 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.587063 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.087038459 +0000 UTC m=+147.746518573 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.587307 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.587676 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.0876646 +0000 UTC m=+147.747144714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.624538 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:27 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:27 crc kubenswrapper[4747]: [+]process-running ok Sep 30 18:48:27 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.624618 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.688380 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.688712 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 18:48:28.188696555 +0000 UTC m=+147.848176669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.789723 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.790185 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.290163945 +0000 UTC m=+147.949644139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.891048 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.891501 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.39148045 +0000 UTC m=+148.050960564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:27 crc kubenswrapper[4747]: I0930 18:48:27.992322 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:27 crc kubenswrapper[4747]: E0930 18:48:27.993150 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.493133986 +0000 UTC m=+148.152614100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.093964 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.094355 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.594336407 +0000 UTC m=+148.253816521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.195122 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.195425 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.695413363 +0000 UTC m=+148.354893477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.199535 4747 generic.go:334] "Generic (PLEG): container finished" podID="288d90b3-91c8-4768-8920-939d1e515807" containerID="93b9ad94be0b901998b112acde94d24f8b941f4ca2a8a97178cc78538429822d" exitCode=0 Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.199595 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" event={"ID":"288d90b3-91c8-4768-8920-939d1e515807","Type":"ContainerStarted","Data":"a21ef7f46dd0a7dbb01d99d64b55720ddad19974e23d6e05d024f5be97bcceb2"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.199622 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" event={"ID":"288d90b3-91c8-4768-8920-939d1e515807","Type":"ContainerDied","Data":"93b9ad94be0b901998b112acde94d24f8b941f4ca2a8a97178cc78538429822d"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.201931 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gv745" event={"ID":"dae6e892-a5c9-493d-825c-49f6181a0f41","Type":"ContainerStarted","Data":"cd9f89b5c2de42bafdfe069d08f8718f1ba083f8369eacf281cc7124e523a8c6"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.203001 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" 
event={"ID":"3475dad0-af15-4cbe-b43c-640fcebd0873","Type":"ContainerStarted","Data":"d1f426bbf40ad797a5ba881bf0571268c9601d85ea81bcd981485fbf691d54ca"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.204372 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7" event={"ID":"66d8c362-4cc5-48b4-898d-1ab38933bcd7","Type":"ContainerStarted","Data":"e246ab5eb91c4bde4e7ac695d0aa8a2faa46b7613aca79364c52453956330b32"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.206070 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fml97" event={"ID":"e65d8d96-ac46-4fa8-b940-55290af981d5","Type":"ContainerStarted","Data":"68a2ce41dea2edc1322ba99b21e7c023a778700e64b55b21a65b210cc39db953"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.208122 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" event={"ID":"0a4e1fbe-d463-46af-8e64-223fd290f89c","Type":"ContainerStarted","Data":"0f0d81ad800e3e691ca0ac9054bdf4e75b0b61ac67a14ddd2ce1c0348950dc1c"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.208150 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" event={"ID":"0a4e1fbe-d463-46af-8e64-223fd290f89c","Type":"ContainerStarted","Data":"912967c7fa44978ed60a5237339fbc36bb9fca5fc7637b55e16ccb775278d9d7"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.209875 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" event={"ID":"ce1b7795-da9d-4275-b9c5-f0dfe3cc9917","Type":"ContainerStarted","Data":"0bb5d841a5a5fce02c043d76be1077e6ce21b1af8dc96b75c1c0d67c60a81157"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.211392 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gbqbf" 
event={"ID":"52297ca2-f082-428f-bcb3-aaf6d37e354f","Type":"ContainerStarted","Data":"b022f0e9c22fbc767657cc38c1a73b20ae8ddc9fd59019bdf9039bba19b8e834"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.211417 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gbqbf" event={"ID":"52297ca2-f082-428f-bcb3-aaf6d37e354f","Type":"ContainerStarted","Data":"86bd867f5e46049c926c2c22883533ee17f877b57efb710ce61ab6ef446f00d5"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.211971 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.214600 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" event={"ID":"4a1decac-0009-405b-9e01-8669eb06a74e","Type":"ContainerStarted","Data":"c67cfa4bf35c1f0aaf413b8fb2658dfb83dc46c3ba6326244c5e142b796f8d71"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.214634 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" event={"ID":"4a1decac-0009-405b-9e01-8669eb06a74e","Type":"ContainerStarted","Data":"145cc6a18764038cafc65c3bfeb259ed257e6d4c08341ecea4c5c5ca990bf6a8"} Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.220052 4747 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8kfbg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.220094 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" podUID="0007394a-7089-407b-ad0f-25c9794ccefe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: 
connect: connection refused" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.221193 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-56pzb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.221244 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56pzb" podUID="0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.226155 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.237320 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wlwjd" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.269415 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" podStartSLOduration=126.269398591 podStartE2EDuration="2m6.269398591s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:28.267899012 +0000 UTC m=+147.927379126" watchObservedRunningTime="2025-09-30 18:48:28.269398591 +0000 UTC m=+147.928878705" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.270503 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z4ch7" podStartSLOduration=126.270496207 
podStartE2EDuration="2m6.270496207s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:27.443876541 +0000 UTC m=+147.103356655" watchObservedRunningTime="2025-09-30 18:48:28.270496207 +0000 UTC m=+147.929976321" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.293116 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z45qm" podStartSLOduration=126.293098089 podStartE2EDuration="2m6.293098089s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:28.291586909 +0000 UTC m=+147.951067043" watchObservedRunningTime="2025-09-30 18:48:28.293098089 +0000 UTC m=+147.952578203" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.306440 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.308771 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.808748522 +0000 UTC m=+148.468228636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.431837 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.432210 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:28.932197723 +0000 UTC m=+148.591677837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.438680 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" podStartSLOduration=126.438640665 podStartE2EDuration="2m6.438640665s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:28.405083714 +0000 UTC m=+148.064563828" watchObservedRunningTime="2025-09-30 18:48:28.438640665 +0000 UTC m=+148.098120779" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.439593 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gbqbf" podStartSLOduration=8.439587056 podStartE2EDuration="8.439587056s" podCreationTimestamp="2025-09-30 18:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:28.436216235 +0000 UTC m=+148.095696349" watchObservedRunningTime="2025-09-30 18:48:28.439587056 +0000 UTC m=+148.099067170" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.465141 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jz9dr" podStartSLOduration=126.465122974 podStartE2EDuration="2m6.465122974s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:28.464215924 +0000 UTC m=+148.123696038" watchObservedRunningTime="2025-09-30 18:48:28.465122974 +0000 UTC m=+148.124603088" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.534891 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.535295 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.035273095 +0000 UTC m=+148.694753199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.535329 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.535732 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.03572539 +0000 UTC m=+148.695205504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.556133 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-74cqx" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.581154 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.581686 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.598508 4747 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7jqj9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.598595 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" podUID="4a1decac-0009-405b-9e01-8669eb06a74e" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection 
refused" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.623531 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:28 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:28 crc kubenswrapper[4747]: [+]process-running ok Sep 30 18:48:28 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.623601 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.636788 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.637366 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.137341364 +0000 UTC m=+148.796821478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.666412 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.738066 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.738355 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.238337658 +0000 UTC m=+148.897817962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.839617 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.839858 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.339818158 +0000 UTC m=+148.999298272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.840192 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.840789 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.34078058 +0000 UTC m=+149.000260694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.894141 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-84v6x"] Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.897715 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.899159 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.907512 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2p5vp" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.913161 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84v6x"] Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.941372 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.941600 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmpzl\" (UniqueName: 
\"kubernetes.io/projected/26c98ff0-f864-4301-9423-f037408bce18-kube-api-access-jmpzl\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.941670 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-catalog-content\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.941739 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-utilities\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:28 crc kubenswrapper[4747]: E0930 18:48:28.941832 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.441816515 +0000 UTC m=+149.101296629 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.980528 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.980594 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:28 crc kubenswrapper[4747]: I0930 18:48:28.991358 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-h7w64" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.045195 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmpzl\" (UniqueName: \"kubernetes.io/projected/26c98ff0-f864-4301-9423-f037408bce18-kube-api-access-jmpzl\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.045857 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-catalog-content\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.046036 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.046144 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-utilities\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.046662 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-utilities\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.046991 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-catalog-content\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.047321 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.547299407 +0000 UTC m=+149.206779521 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.098455 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmpzl\" (UniqueName: \"kubernetes.io/projected/26c98ff0-f864-4301-9423-f037408bce18-kube-api-access-jmpzl\") pod \"community-operators-84v6x\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.098466 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qrfvq"] Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.099683 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.104260 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.138649 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrfvq"] Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.150460 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.150572 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.650550485 +0000 UTC m=+149.310030599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.150637 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42m7f\" (UniqueName: \"kubernetes.io/projected/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-kube-api-access-42m7f\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.150688 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-catalog-content\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.150718 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.150757 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-utilities\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.151051 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.651044081 +0000 UTC m=+149.310524195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.211411 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.246221 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" event={"ID":"3475dad0-af15-4cbe-b43c-640fcebd0873","Type":"ContainerStarted","Data":"a6074ebcab0460f3e43d3a659485954bf8ade0f4a2688a9b5aa5f0841cf9654e"} Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.251073 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-56pzb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.251088 4747 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8kfbg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.251131 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56pzb" podUID="0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.251151 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" podUID="0007394a-7089-407b-ad0f-25c9794ccefe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.261182 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.261493 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42m7f\" (UniqueName: \"kubernetes.io/projected/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-kube-api-access-42m7f\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.261584 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-catalog-content\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.261704 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-utilities\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.262185 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-utilities\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.262258 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.76224451 +0000 UTC m=+149.421724624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.262649 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-catalog-content\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.311137 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mzhtj"] Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.312401 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.316673 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42m7f\" (UniqueName: \"kubernetes.io/projected/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-kube-api-access-42m7f\") pod \"certified-operators-qrfvq\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.330672 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mzhtj"] Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.375550 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-catalog-content\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.375772 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.376118 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnwzx\" (UniqueName: \"kubernetes.io/projected/61cf3ea9-c177-48e5-ab64-4300b58f4875-kube-api-access-fnwzx\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 
18:48:29.376413 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-utilities\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.384104 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.884076348 +0000 UTC m=+149.543556462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.420226 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.481483 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.481640 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnwzx\" (UniqueName: \"kubernetes.io/projected/61cf3ea9-c177-48e5-ab64-4300b58f4875-kube-api-access-fnwzx\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.481685 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-utilities\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.481729 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-catalog-content\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.482369 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-catalog-content\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") 
" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.482461 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:29.982440375 +0000 UTC m=+149.641920489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.482687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-utilities\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.519672 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jz555"] Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.521024 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.549063 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnwzx\" (UniqueName: \"kubernetes.io/projected/61cf3ea9-c177-48e5-ab64-4300b58f4875-kube-api-access-fnwzx\") pod \"community-operators-mzhtj\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.575748 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jz555"] Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.589164 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.589462 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:30.089450597 +0000 UTC m=+149.748930711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.671054 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:29 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:29 crc kubenswrapper[4747]: [+]process-running ok Sep 30 18:48:29 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.671103 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.691555 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.691910 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 18:48:30.191895009 +0000 UTC m=+149.851375123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.691964 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhjt\" (UniqueName: \"kubernetes.io/projected/90237139-34d3-4f31-8b81-0b34418c19ff-kube-api-access-gfhjt\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.692035 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-catalog-content\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.692054 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-utilities\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.703976 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.716101 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.792740 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.792797 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhjt\" (UniqueName: \"kubernetes.io/projected/90237139-34d3-4f31-8b81-0b34418c19ff-kube-api-access-gfhjt\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.792851 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-catalog-content\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.792871 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-utilities\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.793254 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-utilities\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.793498 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:30.293487372 +0000 UTC m=+149.952967486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.794020 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-catalog-content\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.817552 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhjt\" (UniqueName: \"kubernetes.io/projected/90237139-34d3-4f31-8b81-0b34418c19ff-kube-api-access-gfhjt\") pod \"certified-operators-jz555\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.893433 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.893858 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:30.393842936 +0000 UTC m=+150.053323050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.907829 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.954162 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrfvq"] Sep 30 18:48:29 crc kubenswrapper[4747]: W0930 18:48:29.981141 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9fc6cc_7437_4639_9c0a_05e8c2ce1042.slice/crio-031e65a485c574b55dbbd88ea40bd10679f14faaa67fdfcb89126f4087338659 WatchSource:0}: Error finding container 031e65a485c574b55dbbd88ea40bd10679f14faaa67fdfcb89126f4087338659: Status 404 returned error can't find the container with id 031e65a485c574b55dbbd88ea40bd10679f14faaa67fdfcb89126f4087338659 Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.986059 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84v6x"] Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.994573 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.994617 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.994637 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.994656 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.994703 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:29 crc kubenswrapper[4747]: E0930 18:48:29.996508 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:30.496492164 +0000 UTC m=+150.155972278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:29 crc kubenswrapper[4747]: I0930 18:48:29.998032 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.000634 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.015248 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.018627 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.096361 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:30 crc kubenswrapper[4747]: E0930 18:48:30.096751 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 18:48:30.596734204 +0000 UTC m=+150.256214308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.102135 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.115906 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.133241 4747 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.200626 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:30 crc kubenswrapper[4747]: E0930 18:48:30.200966 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 18:48:30.700954314 +0000 UTC m=+150.360434428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h88d" (UID: "07103b35-ea08-4d06-b981-d04736a21d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.256292 4747 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T18:48:30.133268003Z","Handler":null,"Name":""} Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.282788 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" event={"ID":"3475dad0-af15-4cbe-b43c-640fcebd0873","Type":"ContainerStarted","Data":"e82a854f0a4a497f8a1d5a684521486d600e1d45d48c3bc479b04f15d4943543"} Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.284007 4747 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.284037 4747 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.301385 4747 generic.go:334] "Generic (PLEG): container finished" podID="6b5cf81a-9d0e-4f3f-8596-5c1e17f87431" containerID="99a40801fd332ee6c2834af04f43342f0d34b48265ee3bc75add57c395955901" exitCode=0 Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.301493 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" event={"ID":"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431","Type":"ContainerDied","Data":"99a40801fd332ee6c2834af04f43342f0d34b48265ee3bc75add57c395955901"} Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.301569 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.306445 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84v6x" event={"ID":"26c98ff0-f864-4301-9423-f037408bce18","Type":"ContainerStarted","Data":"3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad"} Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.306485 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84v6x" event={"ID":"26c98ff0-f864-4301-9423-f037408bce18","Type":"ContainerStarted","Data":"a509c1dbfd54bbf384d65805647a5aadc50f970cfd087da96781f9c7e786dc87"} Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.307355 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.307732 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.317283 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrfvq" event={"ID":"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042","Type":"ContainerStarted","Data":"1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476"} Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.317318 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrfvq" event={"ID":"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042","Type":"ContainerStarted","Data":"031e65a485c574b55dbbd88ea40bd10679f14faaa67fdfcb89126f4087338659"} Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.327845 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-478bd" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.337063 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.338188 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mzhtj"] Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.402616 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.415406 4747 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.415447 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.433099 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jz555"] Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.509743 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h88d\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.608663 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.624354 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:30 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:30 crc kubenswrapper[4747]: [+]process-running ok Sep 30 18:48:30 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:30 crc kubenswrapper[4747]: I0930 18:48:30.624409 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.102725 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vs5"] Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.104307 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.108613 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.127337 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.137227 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vs5"] Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.230556 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-catalog-content\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.230609 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m66z5\" (UniqueName: \"kubernetes.io/projected/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-kube-api-access-m66z5\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.230665 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-utilities\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.296270 
4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h88d"] Sep 30 18:48:31 crc kubenswrapper[4747]: W0930 18:48:31.304498 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07103b35_ea08_4d06_b981_d04736a21d17.slice/crio-464f3ea0b76001a5a5ad3bc489a05c05f611dcd2c4699c9aff4fec9e7f0fedae WatchSource:0}: Error finding container 464f3ea0b76001a5a5ad3bc489a05c05f611dcd2c4699c9aff4fec9e7f0fedae: Status 404 returned error can't find the container with id 464f3ea0b76001a5a5ad3bc489a05c05f611dcd2c4699c9aff4fec9e7f0fedae Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.340458 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-catalog-content\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.340512 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m66z5\" (UniqueName: \"kubernetes.io/projected/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-kube-api-access-m66z5\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.340563 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-utilities\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.341082 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-catalog-content\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.342609 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-utilities\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.356768 4747 generic.go:334] "Generic (PLEG): container finished" podID="26c98ff0-f864-4301-9423-f037408bce18" containerID="3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad" exitCode=0 Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.357159 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84v6x" event={"ID":"26c98ff0-f864-4301-9423-f037408bce18","Type":"ContainerDied","Data":"3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.360214 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3033f9e419ed106a6e5234d3ae17e77a9628dd5c49b62f7cea53b7374a807b2d"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.360270 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"627d97c1ca8eb36c6884b5e20d7a67101fb3389a27979612ff80f6dac4899e0d"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.363255 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.367143 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c9dace18a160e29280d0b2ddb6008ae54064309de51f6730041f817c40839720"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.367198 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a8be2edfa8752c4e54783e54df1890f8c3f56b6baa25156bbcafa93af6d2e224"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.380587 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m66z5\" (UniqueName: \"kubernetes.io/projected/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-kube-api-access-m66z5\") pod \"redhat-marketplace-x2vs5\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.383719 4747 generic.go:334] "Generic (PLEG): container finished" podID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerID="1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476" exitCode=0 Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.383790 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrfvq" event={"ID":"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042","Type":"ContainerDied","Data":"1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.392519 4747 generic.go:334] "Generic (PLEG): container finished" podID="90237139-34d3-4f31-8b81-0b34418c19ff" containerID="9200cc3d58e817219c087aa3b84deaf5fce43f6ec5b92276893f0d34ee77f4c4" exitCode=0 Sep 30 18:48:31 
crc kubenswrapper[4747]: I0930 18:48:31.393679 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz555" event={"ID":"90237139-34d3-4f31-8b81-0b34418c19ff","Type":"ContainerDied","Data":"9200cc3d58e817219c087aa3b84deaf5fce43f6ec5b92276893f0d34ee77f4c4"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.393711 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz555" event={"ID":"90237139-34d3-4f31-8b81-0b34418c19ff","Type":"ContainerStarted","Data":"cae0ec302efa2c907e09007e83d6130f5e4b8e3911ee8c55c53af09abdb1a27f"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.406214 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" event={"ID":"3475dad0-af15-4cbe-b43c-640fcebd0873","Type":"ContainerStarted","Data":"01753ea7180520e88e6586afd9eee03954b9b50bd0cce4f8b7b80fae047a7b83"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.422911 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" event={"ID":"07103b35-ea08-4d06-b981-d04736a21d17","Type":"ContainerStarted","Data":"464f3ea0b76001a5a5ad3bc489a05c05f611dcd2c4699c9aff4fec9e7f0fedae"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.426711 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"45ea3d9af9ab75eff3bfa99615076aaf0e5fbce7d826f2a8d35a04917dec41ff"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.426733 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"656d3d37652b51b7e1f07b092be88b0064565e3c8e8c06aec68d972cb5d4ea3d"} Sep 30 18:48:31 crc 
kubenswrapper[4747]: I0930 18:48:31.430904 4747 generic.go:334] "Generic (PLEG): container finished" podID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerID="a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f" exitCode=0 Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.431002 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzhtj" event={"ID":"61cf3ea9-c177-48e5-ab64-4300b58f4875","Type":"ContainerDied","Data":"a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.431019 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzhtj" event={"ID":"61cf3ea9-c177-48e5-ab64-4300b58f4875","Type":"ContainerStarted","Data":"e7e9c9982fdc4e47e2270f3f418e8dd3180c2c78bcb4503fa7d4d095866d01fa"} Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.444696 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.483795 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hv9wq"] Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.484834 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.497654 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv9wq"] Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.524007 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4p8pb" podStartSLOduration=11.52399194 podStartE2EDuration="11.52399194s" podCreationTimestamp="2025-09-30 18:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:31.519631457 +0000 UTC m=+151.179111581" watchObservedRunningTime="2025-09-30 18:48:31.52399194 +0000 UTC m=+151.183472054" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.546719 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqx26\" (UniqueName: \"kubernetes.io/projected/cd093df4-4739-4052-84df-edac578c053a-kube-api-access-tqx26\") pod \"redhat-marketplace-hv9wq\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.547030 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-utilities\") pod \"redhat-marketplace-hv9wq\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.547050 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-catalog-content\") pod \"redhat-marketplace-hv9wq\" (UID: 
\"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.619671 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.620317 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.625146 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.625339 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.625557 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:31 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:31 crc kubenswrapper[4747]: [+]process-running ok Sep 30 18:48:31 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.625584 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.642877 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.648819 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tqx26\" (UniqueName: \"kubernetes.io/projected/cd093df4-4739-4052-84df-edac578c053a-kube-api-access-tqx26\") pod \"redhat-marketplace-hv9wq\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.648909 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-utilities\") pod \"redhat-marketplace-hv9wq\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.648962 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-catalog-content\") pod \"redhat-marketplace-hv9wq\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.649431 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-catalog-content\") pod \"redhat-marketplace-hv9wq\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.649951 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-utilities\") pod \"redhat-marketplace-hv9wq\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.684038 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqx26\" (UniqueName: 
\"kubernetes.io/projected/cd093df4-4739-4052-84df-edac578c053a-kube-api-access-tqx26\") pod \"redhat-marketplace-hv9wq\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.750141 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36169e44-b2a1-4333-b002-d03b7d147842-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"36169e44-b2a1-4333-b002-d03b7d147842\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.750624 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36169e44-b2a1-4333-b002-d03b7d147842-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"36169e44-b2a1-4333-b002-d03b7d147842\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.812187 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.850368 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.851806 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36169e44-b2a1-4333-b002-d03b7d147842-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"36169e44-b2a1-4333-b002-d03b7d147842\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.851913 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36169e44-b2a1-4333-b002-d03b7d147842-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"36169e44-b2a1-4333-b002-d03b7d147842\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.851970 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36169e44-b2a1-4333-b002-d03b7d147842-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"36169e44-b2a1-4333-b002-d03b7d147842\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.870821 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36169e44-b2a1-4333-b002-d03b7d147842-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"36169e44-b2a1-4333-b002-d03b7d147842\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.890155 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vs5"] Sep 30 18:48:31 crc kubenswrapper[4747]: W0930 18:48:31.908734 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddabd124e_9bb4_49f6_86e2_1ab79d3e314e.slice/crio-6028f4ce71ce2d850e8dcc2882969f505add34a5d692ca50ec93dcaf42b8c872 WatchSource:0}: Error finding container 6028f4ce71ce2d850e8dcc2882969f505add34a5d692ca50ec93dcaf42b8c872: Status 404 returned error can't find the container with id 6028f4ce71ce2d850e8dcc2882969f505add34a5d692ca50ec93dcaf42b8c872 Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.953541 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-config-volume\") pod \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.953632 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-secret-volume\") pod \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.953760 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n98s2\" (UniqueName: \"kubernetes.io/projected/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-kube-api-access-n98s2\") pod \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\" (UID: \"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431\") " Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.954443 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b5cf81a-9d0e-4f3f-8596-5c1e17f87431" (UID: "6b5cf81a-9d0e-4f3f-8596-5c1e17f87431"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.955207 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.972543 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-kube-api-access-n98s2" (OuterVolumeSpecName: "kube-api-access-n98s2") pod "6b5cf81a-9d0e-4f3f-8596-5c1e17f87431" (UID: "6b5cf81a-9d0e-4f3f-8596-5c1e17f87431"). InnerVolumeSpecName "kube-api-access-n98s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:48:31 crc kubenswrapper[4747]: I0930 18:48:31.976379 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b5cf81a-9d0e-4f3f-8596-5c1e17f87431" (UID: "6b5cf81a-9d0e-4f3f-8596-5c1e17f87431"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.055863 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n98s2\" (UniqueName: \"kubernetes.io/projected/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-kube-api-access-n98s2\") on node \"crc\" DevicePath \"\"" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.056099 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.056109 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.070339 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv9wq"] Sep 30 18:48:32 crc kubenswrapper[4747]: W0930 18:48:32.095945 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd093df4_4739_4052_84df_edac578c053a.slice/crio-2ce369b0b66915a247cc05060bdef19f3170141f8a131621d11828c57321c8af WatchSource:0}: Error finding container 2ce369b0b66915a247cc05060bdef19f3170141f8a131621d11828c57321c8af: Status 404 returned error can't find the container with id 2ce369b0b66915a247cc05060bdef19f3170141f8a131621d11828c57321c8af Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.100354 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vnw6d"] Sep 30 18:48:32 crc kubenswrapper[4747]: E0930 18:48:32.102631 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5cf81a-9d0e-4f3f-8596-5c1e17f87431" containerName="collect-profiles" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 
18:48:32.102670 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5cf81a-9d0e-4f3f-8596-5c1e17f87431" containerName="collect-profiles" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.103073 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5cf81a-9d0e-4f3f-8596-5c1e17f87431" containerName="collect-profiles" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.107977 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.110833 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vnw6d"] Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.116664 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.254502 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.265495 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-utilities\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.265591 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-catalog-content\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.265642 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksf6\" (UniqueName: \"kubernetes.io/projected/95d8b334-0a57-47a1-bfea-3e30f6527e13-kube-api-access-qksf6\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.373511 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qksf6\" (UniqueName: \"kubernetes.io/projected/95d8b334-0a57-47a1-bfea-3e30f6527e13-kube-api-access-qksf6\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.373627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-utilities\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.373778 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-catalog-content\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.374350 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-utilities\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.374490 4747 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-catalog-content\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.394055 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qksf6\" (UniqueName: \"kubernetes.io/projected/95d8b334-0a57-47a1-bfea-3e30f6527e13-kube-api-access-qksf6\") pod \"redhat-operators-vnw6d\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.438599 4747 generic.go:334] "Generic (PLEG): container finished" podID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerID="0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae" exitCode=0 Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.439660 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vs5" event={"ID":"dabd124e-9bb4-49f6-86e2-1ab79d3e314e","Type":"ContainerDied","Data":"0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae"} Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.439695 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vs5" event={"ID":"dabd124e-9bb4-49f6-86e2-1ab79d3e314e","Type":"ContainerStarted","Data":"6028f4ce71ce2d850e8dcc2882969f505add34a5d692ca50ec93dcaf42b8c872"} Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.443828 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" event={"ID":"07103b35-ea08-4d06-b981-d04736a21d17","Type":"ContainerStarted","Data":"8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18"} Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.443904 4747 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.446114 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.446664 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg" event={"ID":"6b5cf81a-9d0e-4f3f-8596-5c1e17f87431","Type":"ContainerDied","Data":"726cc1926c1df328c4721b3ca9480e07217c849d6878ce4d11442919760d5641"} Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.446708 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="726cc1926c1df328c4721b3ca9480e07217c849d6878ce4d11442919760d5641" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.448980 4747 generic.go:334] "Generic (PLEG): container finished" podID="cd093df4-4739-4052-84df-edac578c053a" containerID="8659397b42d31609a4bd7ad6706d1efb7aeeb54402724fe77d9496773eb84e4b" exitCode=0 Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.448972 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv9wq" event={"ID":"cd093df4-4739-4052-84df-edac578c053a","Type":"ContainerDied","Data":"8659397b42d31609a4bd7ad6706d1efb7aeeb54402724fe77d9496773eb84e4b"} Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.449070 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv9wq" event={"ID":"cd093df4-4739-4052-84df-edac578c053a","Type":"ContainerStarted","Data":"2ce369b0b66915a247cc05060bdef19f3170141f8a131621d11828c57321c8af"} Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.450809 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"36169e44-b2a1-4333-b002-d03b7d147842","Type":"ContainerStarted","Data":"e0a6a3357a65695fb1e47986dbb4bf45cf9c9d6b1e74ff1124225230583eafaf"} Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.481178 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x8f7n"] Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.482186 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.499049 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x8f7n"] Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.501385 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" podStartSLOduration=130.50132941 podStartE2EDuration="2m10.50132941s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:32.490509475 +0000 UTC m=+152.149989589" watchObservedRunningTime="2025-09-30 18:48:32.50132941 +0000 UTC m=+152.160809524" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.535868 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.576870 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-utilities\") pod \"redhat-operators-x8f7n\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.577150 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blv59\" (UniqueName: \"kubernetes.io/projected/edc0a7dd-970c-47f6-830b-27fbc0caaecb-kube-api-access-blv59\") pod \"redhat-operators-x8f7n\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.577283 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-catalog-content\") pod \"redhat-operators-x8f7n\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.623073 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:32 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:32 crc kubenswrapper[4747]: [+]process-running ok Sep 30 18:48:32 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.623489 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" 
podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.679107 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blv59\" (UniqueName: \"kubernetes.io/projected/edc0a7dd-970c-47f6-830b-27fbc0caaecb-kube-api-access-blv59\") pod \"redhat-operators-x8f7n\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.679214 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-catalog-content\") pod \"redhat-operators-x8f7n\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.679239 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-utilities\") pod \"redhat-operators-x8f7n\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.679704 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-utilities\") pod \"redhat-operators-x8f7n\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.679919 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-catalog-content\") pod \"redhat-operators-x8f7n\" (UID: 
\"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.703768 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blv59\" (UniqueName: \"kubernetes.io/projected/edc0a7dd-970c-47f6-830b-27fbc0caaecb-kube-api-access-blv59\") pod \"redhat-operators-x8f7n\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.801014 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.845776 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.846901 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.853947 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.854320 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.862307 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.983515 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 
18:48:32 crc kubenswrapper[4747]: I0930 18:48:32.983598 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.084776 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.084861 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.084974 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.106472 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:33 crc kubenswrapper[4747]: W0930 18:48:33.132565 4747 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedc0a7dd_970c_47f6_830b_27fbc0caaecb.slice/crio-63c654b85c2a339edae93bccc594ff5c36b177bf519948e9fab49df7cca80eb2 WatchSource:0}: Error finding container 63c654b85c2a339edae93bccc594ff5c36b177bf519948e9fab49df7cca80eb2: Status 404 returned error can't find the container with id 63c654b85c2a339edae93bccc594ff5c36b177bf519948e9fab49df7cca80eb2 Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.144078 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vnw6d"] Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.144116 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x8f7n"] Sep 30 18:48:33 crc kubenswrapper[4747]: W0930 18:48:33.151297 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d8b334_0a57_47a1_bfea_3e30f6527e13.slice/crio-9e664045c2b6a4b2213b646e8a33cd4f24919f306b6af3d0ee8f8dafe379d9ea WatchSource:0}: Error finding container 9e664045c2b6a4b2213b646e8a33cd4f24919f306b6af3d0ee8f8dafe379d9ea: Status 404 returned error can't find the container with id 9e664045c2b6a4b2213b646e8a33cd4f24919f306b6af3d0ee8f8dafe379d9ea Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.181799 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.465845 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.474047 4747 generic.go:334] "Generic (PLEG): container finished" podID="36169e44-b2a1-4333-b002-d03b7d147842" containerID="884cf1978e48242decea913650398d40fdab6a94b17f7c50ede0f66a260bf482" exitCode=0 Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.474188 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"36169e44-b2a1-4333-b002-d03b7d147842","Type":"ContainerDied","Data":"884cf1978e48242decea913650398d40fdab6a94b17f7c50ede0f66a260bf482"} Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.478079 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnw6d" event={"ID":"95d8b334-0a57-47a1-bfea-3e30f6527e13","Type":"ContainerStarted","Data":"9e664045c2b6a4b2213b646e8a33cd4f24919f306b6af3d0ee8f8dafe379d9ea"} Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.483598 4747 generic.go:334] "Generic (PLEG): container finished" podID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerID="dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d" exitCode=0 Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.483795 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f7n" event={"ID":"edc0a7dd-970c-47f6-830b-27fbc0caaecb","Type":"ContainerDied","Data":"dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d"} Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.483820 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f7n" 
event={"ID":"edc0a7dd-970c-47f6-830b-27fbc0caaecb","Type":"ContainerStarted","Data":"63c654b85c2a339edae93bccc594ff5c36b177bf519948e9fab49df7cca80eb2"} Sep 30 18:48:33 crc kubenswrapper[4747]: W0930 18:48:33.490070 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8fc172df_db4a_4bcc_8b41_d5fdbe7b7709.slice/crio-1439e22a1e97fc765a717049bbe5157c37c8b076d9f7bcba40c1f3b084f96f8c WatchSource:0}: Error finding container 1439e22a1e97fc765a717049bbe5157c37c8b076d9f7bcba40c1f3b084f96f8c: Status 404 returned error can't find the container with id 1439e22a1e97fc765a717049bbe5157c37c8b076d9f7bcba40c1f3b084f96f8c Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.588809 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.594524 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7jqj9" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.615783 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.615826 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.619153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.622310 4747 patch_prober.go:28] interesting pod/router-default-5444994796-6xh4b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 18:48:33 crc kubenswrapper[4747]: [-]has-synced failed: reason withheld Sep 30 18:48:33 
crc kubenswrapper[4747]: [+]process-running ok Sep 30 18:48:33 crc kubenswrapper[4747]: healthz check failed Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.622362 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xh4b" podUID="0971ea8b-120a-4f19-85d8-a1f349d91c8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.623398 4747 patch_prober.go:28] interesting pod/console-f9d7485db-sn6w4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.623423 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sn6w4" podUID="5f90c236-a235-4782-8351-cad3bb90e3fa" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.693132 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-56pzb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.693457 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56pzb" podUID="0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.693283 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-56pzb container/download-server namespace/openshift-console: Liveness 
probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Sep 30 18:48:33 crc kubenswrapper[4747]: I0930 18:48:33.693766 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-56pzb" podUID="0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.197468 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.509834 4747 generic.go:334] "Generic (PLEG): container finished" podID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerID="f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee" exitCode=0 Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.509944 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnw6d" event={"ID":"95d8b334-0a57-47a1-bfea-3e30f6527e13","Type":"ContainerDied","Data":"f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee"} Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.519309 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709","Type":"ContainerStarted","Data":"eb5103660245f7a89685f730663c41ed64c15ab30424bfe9d09563e4b0cc9b60"} Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.519343 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709","Type":"ContainerStarted","Data":"1439e22a1e97fc765a717049bbe5157c37c8b076d9f7bcba40c1f3b084f96f8c"} Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.553751 4747 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.553736 podStartE2EDuration="2.553736s" podCreationTimestamp="2025-09-30 18:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:48:34.552803119 +0000 UTC m=+154.212283253" watchObservedRunningTime="2025-09-30 18:48:34.553736 +0000 UTC m=+154.213216114" Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.623106 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.625213 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6xh4b" Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.805575 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.923696 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36169e44-b2a1-4333-b002-d03b7d147842-kubelet-dir\") pod \"36169e44-b2a1-4333-b002-d03b7d147842\" (UID: \"36169e44-b2a1-4333-b002-d03b7d147842\") " Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.923802 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36169e44-b2a1-4333-b002-d03b7d147842-kube-api-access\") pod \"36169e44-b2a1-4333-b002-d03b7d147842\" (UID: \"36169e44-b2a1-4333-b002-d03b7d147842\") " Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.925280 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36169e44-b2a1-4333-b002-d03b7d147842-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "36169e44-b2a1-4333-b002-d03b7d147842" (UID: "36169e44-b2a1-4333-b002-d03b7d147842"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:48:34 crc kubenswrapper[4747]: I0930 18:48:34.935434 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36169e44-b2a1-4333-b002-d03b7d147842-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "36169e44-b2a1-4333-b002-d03b7d147842" (UID: "36169e44-b2a1-4333-b002-d03b7d147842"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:48:35 crc kubenswrapper[4747]: I0930 18:48:35.025724 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36169e44-b2a1-4333-b002-d03b7d147842-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 18:48:35 crc kubenswrapper[4747]: I0930 18:48:35.025772 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36169e44-b2a1-4333-b002-d03b7d147842-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 18:48:35 crc kubenswrapper[4747]: I0930 18:48:35.539173 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"36169e44-b2a1-4333-b002-d03b7d147842","Type":"ContainerDied","Data":"e0a6a3357a65695fb1e47986dbb4bf45cf9c9d6b1e74ff1124225230583eafaf"} Sep 30 18:48:35 crc kubenswrapper[4747]: I0930 18:48:35.539236 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0a6a3357a65695fb1e47986dbb4bf45cf9c9d6b1e74ff1124225230583eafaf" Sep 30 18:48:35 crc kubenswrapper[4747]: I0930 18:48:35.539201 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 18:48:35 crc kubenswrapper[4747]: I0930 18:48:35.548616 4747 generic.go:334] "Generic (PLEG): container finished" podID="8fc172df-db4a-4bcc-8b41-d5fdbe7b7709" containerID="eb5103660245f7a89685f730663c41ed64c15ab30424bfe9d09563e4b0cc9b60" exitCode=0 Sep 30 18:48:35 crc kubenswrapper[4747]: I0930 18:48:35.548691 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709","Type":"ContainerDied","Data":"eb5103660245f7a89685f730663c41ed64c15ab30424bfe9d09563e4b0cc9b60"} Sep 30 18:48:36 crc kubenswrapper[4747]: I0930 18:48:36.256456 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gbqbf" Sep 30 18:48:37 crc kubenswrapper[4747]: I0930 18:48:37.655904 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:48:37 crc kubenswrapper[4747]: I0930 18:48:37.656238 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:48:43 crc kubenswrapper[4747]: I0930 18:48:43.623331 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:43 crc kubenswrapper[4747]: I0930 18:48:43.631663 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:48:43 crc kubenswrapper[4747]: 
I0930 18:48:43.692976 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-56pzb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Sep 30 18:48:43 crc kubenswrapper[4747]: I0930 18:48:43.693129 4747 patch_prober.go:28] interesting pod/downloads-7954f5f757-56pzb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Sep 30 18:48:43 crc kubenswrapper[4747]: I0930 18:48:43.693213 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-56pzb" podUID="0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Sep 30 18:48:43 crc kubenswrapper[4747]: I0930 18:48:43.693129 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-56pzb" podUID="0a09cd88-1c38-40aa-9d86-bddf0e7c2fb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Sep 30 18:48:44 crc kubenswrapper[4747]: I0930 18:48:44.997213 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.099992 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kubelet-dir\") pod \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\" (UID: \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\") " Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.100094 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kube-api-access\") pod \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\" (UID: \"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709\") " Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.100186 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8fc172df-db4a-4bcc-8b41-d5fdbe7b7709" (UID: "8fc172df-db4a-4bcc-8b41-d5fdbe7b7709"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.100372 4747 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.108135 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8fc172df-db4a-4bcc-8b41-d5fdbe7b7709" (UID: "8fc172df-db4a-4bcc-8b41-d5fdbe7b7709"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.202458 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc172df-db4a-4bcc-8b41-d5fdbe7b7709-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.406244 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.414178 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8-metrics-certs\") pod \"network-metrics-daemon-fbzb6\" (UID: \"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8\") " pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.647501 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8fc172df-db4a-4bcc-8b41-d5fdbe7b7709","Type":"ContainerDied","Data":"1439e22a1e97fc765a717049bbe5157c37c8b076d9f7bcba40c1f3b084f96f8c"} Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.647542 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1439e22a1e97fc765a717049bbe5157c37c8b076d9f7bcba40c1f3b084f96f8c" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.647566 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 18:48:45 crc kubenswrapper[4747]: I0930 18:48:45.708209 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbzb6" Sep 30 18:48:50 crc kubenswrapper[4747]: I0930 18:48:50.615767 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:48:53 crc kubenswrapper[4747]: I0930 18:48:53.700797 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-56pzb" Sep 30 18:48:59 crc kubenswrapper[4747]: E0930 18:48:59.115878 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 18:48:59 crc kubenswrapper[4747]: E0930 18:48:59.116469 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m66z5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-x2vs5_openshift-marketplace(dabd124e-9bb4-49f6-86e2-1ab79d3e314e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 18:48:59 crc kubenswrapper[4747]: E0930 18:48:59.117782 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-x2vs5" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" Sep 30 18:48:59 crc 
kubenswrapper[4747]: I0930 18:48:59.532650 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fbzb6"] Sep 30 18:48:59 crc kubenswrapper[4747]: W0930 18:48:59.539259 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5619b0a1_efbb_4fdb_b08f_0ac5ba1bbdc8.slice/crio-8edf71d9f0e0f40ab182d96276eb61af192f445c60e6a563379a257bdb9809da WatchSource:0}: Error finding container 8edf71d9f0e0f40ab182d96276eb61af192f445c60e6a563379a257bdb9809da: Status 404 returned error can't find the container with id 8edf71d9f0e0f40ab182d96276eb61af192f445c60e6a563379a257bdb9809da Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.752050 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" event={"ID":"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8","Type":"ContainerStarted","Data":"8edf71d9f0e0f40ab182d96276eb61af192f445c60e6a563379a257bdb9809da"} Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.755199 4747 generic.go:334] "Generic (PLEG): container finished" podID="90237139-34d3-4f31-8b81-0b34418c19ff" containerID="fd5ed43bbd8f38ba32645b0f272a78b5efd5b59aa237bb9f22f9383b79a1f941" exitCode=0 Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.755267 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz555" event={"ID":"90237139-34d3-4f31-8b81-0b34418c19ff","Type":"ContainerDied","Data":"fd5ed43bbd8f38ba32645b0f272a78b5efd5b59aa237bb9f22f9383b79a1f941"} Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.761468 4747 generic.go:334] "Generic (PLEG): container finished" podID="cd093df4-4739-4052-84df-edac578c053a" containerID="7a82dc093612ffc3c045cbb98472e6c9d6e5a23420594c6e33c9f1e4dd8b1bbe" exitCode=0 Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.762010 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-hv9wq" event={"ID":"cd093df4-4739-4052-84df-edac578c053a","Type":"ContainerDied","Data":"7a82dc093612ffc3c045cbb98472e6c9d6e5a23420594c6e33c9f1e4dd8b1bbe"} Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.782689 4747 generic.go:334] "Generic (PLEG): container finished" podID="26c98ff0-f864-4301-9423-f037408bce18" containerID="1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054" exitCode=0 Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.782747 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84v6x" event={"ID":"26c98ff0-f864-4301-9423-f037408bce18","Type":"ContainerDied","Data":"1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054"} Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.786264 4747 generic.go:334] "Generic (PLEG): container finished" podID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerID="5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14" exitCode=0 Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.786817 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzhtj" event={"ID":"61cf3ea9-c177-48e5-ab64-4300b58f4875","Type":"ContainerDied","Data":"5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14"} Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.790362 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnw6d" event={"ID":"95d8b334-0a57-47a1-bfea-3e30f6527e13","Type":"ContainerStarted","Data":"a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad"} Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.798667 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f7n" event={"ID":"edc0a7dd-970c-47f6-830b-27fbc0caaecb","Type":"ContainerStarted","Data":"16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa"} 
Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.801288 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrfvq" event={"ID":"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042","Type":"ContainerDied","Data":"201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1"} Sep 30 18:48:59 crc kubenswrapper[4747]: I0930 18:48:59.801159 4747 generic.go:334] "Generic (PLEG): container finished" podID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerID="201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1" exitCode=0 Sep 30 18:48:59 crc kubenswrapper[4747]: E0930 18:48:59.813198 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-x2vs5" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.812242 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84v6x" event={"ID":"26c98ff0-f864-4301-9423-f037408bce18","Type":"ContainerStarted","Data":"104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.814842 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzhtj" event={"ID":"61cf3ea9-c177-48e5-ab64-4300b58f4875","Type":"ContainerStarted","Data":"d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.817893 4747 generic.go:334] "Generic (PLEG): container finished" podID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerID="a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad" exitCode=0 Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.817960 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vnw6d" event={"ID":"95d8b334-0a57-47a1-bfea-3e30f6527e13","Type":"ContainerDied","Data":"a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.820913 4747 generic.go:334] "Generic (PLEG): container finished" podID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerID="16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa" exitCode=0 Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.820990 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f7n" event={"ID":"edc0a7dd-970c-47f6-830b-27fbc0caaecb","Type":"ContainerDied","Data":"16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.825763 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrfvq" event={"ID":"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042","Type":"ContainerStarted","Data":"2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.830528 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" event={"ID":"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8","Type":"ContainerStarted","Data":"6992660f9f975e3ce581b6e0a24eeceb16e83515de9c1781f920a7c4e195ad17"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.830578 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbzb6" event={"ID":"5619b0a1-efbb-4fdb-b08f-0ac5ba1bbdc8","Type":"ContainerStarted","Data":"39cf63cb279cd55d13d834b3ac1616b879a376e67a1c330d8886622768946486"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.838987 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz555" 
event={"ID":"90237139-34d3-4f31-8b81-0b34418c19ff","Type":"ContainerStarted","Data":"556ed84891e56efb83d9fd3e50d87e58f664bf0c66ef0dbc53340643a210cb47"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.843048 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv9wq" event={"ID":"cd093df4-4739-4052-84df-edac578c053a","Type":"ContainerStarted","Data":"a104763667a7cc96f4f5e2bb863038b3e203f4d8785ded97da6c54d53b64340b"} Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.851388 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84v6x" podStartSLOduration=3.897301208 podStartE2EDuration="32.851372396s" podCreationTimestamp="2025-09-30 18:48:28 +0000 UTC" firstStartedPulling="2025-09-30 18:48:31.362563912 +0000 UTC m=+151.022044016" lastFinishedPulling="2025-09-30 18:49:00.31663509 +0000 UTC m=+179.976115204" observedRunningTime="2025-09-30 18:49:00.850126915 +0000 UTC m=+180.509607029" watchObservedRunningTime="2025-09-30 18:49:00.851372396 +0000 UTC m=+180.510852510" Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.879980 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jz555" podStartSLOduration=2.742581792 podStartE2EDuration="31.879960614s" podCreationTimestamp="2025-09-30 18:48:29 +0000 UTC" firstStartedPulling="2025-09-30 18:48:31.396975772 +0000 UTC m=+151.056455886" lastFinishedPulling="2025-09-30 18:49:00.534354594 +0000 UTC m=+180.193834708" observedRunningTime="2025-09-30 18:49:00.877026828 +0000 UTC m=+180.536506942" watchObservedRunningTime="2025-09-30 18:49:00.879960614 +0000 UTC m=+180.539440728" Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.900848 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hv9wq" podStartSLOduration=1.913133673 podStartE2EDuration="29.900826109s" 
podCreationTimestamp="2025-09-30 18:48:31 +0000 UTC" firstStartedPulling="2025-09-30 18:48:32.450638307 +0000 UTC m=+152.110118421" lastFinishedPulling="2025-09-30 18:49:00.438330743 +0000 UTC m=+180.097810857" observedRunningTime="2025-09-30 18:49:00.895044879 +0000 UTC m=+180.554525003" watchObservedRunningTime="2025-09-30 18:49:00.900826109 +0000 UTC m=+180.560306233" Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.919152 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fbzb6" podStartSLOduration=158.9191282 podStartE2EDuration="2m38.9191282s" podCreationTimestamp="2025-09-30 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:49:00.916728741 +0000 UTC m=+180.576208865" watchObservedRunningTime="2025-09-30 18:49:00.9191282 +0000 UTC m=+180.578608324" Sep 30 18:49:00 crc kubenswrapper[4747]: I0930 18:49:00.980561 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mzhtj" podStartSLOduration=3.105908775 podStartE2EDuration="31.980529585s" podCreationTimestamp="2025-09-30 18:48:29 +0000 UTC" firstStartedPulling="2025-09-30 18:48:31.434882046 +0000 UTC m=+151.094362160" lastFinishedPulling="2025-09-30 18:49:00.309502856 +0000 UTC m=+179.968982970" observedRunningTime="2025-09-30 18:49:00.97764427 +0000 UTC m=+180.637124394" watchObservedRunningTime="2025-09-30 18:49:00.980529585 +0000 UTC m=+180.640009699" Sep 30 18:49:01 crc kubenswrapper[4747]: I0930 18:49:01.001231 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qrfvq" podStartSLOduration=2.104677569 podStartE2EDuration="32.001209213s" podCreationTimestamp="2025-09-30 18:48:29 +0000 UTC" firstStartedPulling="2025-09-30 18:48:30.33674002 +0000 UTC m=+149.996220134" lastFinishedPulling="2025-09-30 
18:49:00.233271624 +0000 UTC m=+179.892751778" observedRunningTime="2025-09-30 18:49:00.999396464 +0000 UTC m=+180.658876578" watchObservedRunningTime="2025-09-30 18:49:01.001209213 +0000 UTC m=+180.660689337" Sep 30 18:49:01 crc kubenswrapper[4747]: I0930 18:49:01.814204 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:49:01 crc kubenswrapper[4747]: I0930 18:49:01.814761 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:49:01 crc kubenswrapper[4747]: I0930 18:49:01.850175 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f7n" event={"ID":"edc0a7dd-970c-47f6-830b-27fbc0caaecb","Type":"ContainerStarted","Data":"44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359"} Sep 30 18:49:01 crc kubenswrapper[4747]: I0930 18:49:01.853321 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnw6d" event={"ID":"95d8b334-0a57-47a1-bfea-3e30f6527e13","Type":"ContainerStarted","Data":"8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9"} Sep 30 18:49:01 crc kubenswrapper[4747]: I0930 18:49:01.889583 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vnw6d" podStartSLOduration=2.824095732 podStartE2EDuration="29.889562385s" podCreationTimestamp="2025-09-30 18:48:32 +0000 UTC" firstStartedPulling="2025-09-30 18:48:34.51566233 +0000 UTC m=+154.175142434" lastFinishedPulling="2025-09-30 18:49:01.581128973 +0000 UTC m=+181.240609087" observedRunningTime="2025-09-30 18:49:01.88546345 +0000 UTC m=+181.544943574" watchObservedRunningTime="2025-09-30 18:49:01.889562385 +0000 UTC m=+181.549042509" Sep 30 18:49:01 crc kubenswrapper[4747]: I0930 18:49:01.890072 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-x8f7n" podStartSLOduration=2.058814329 podStartE2EDuration="29.890064691s" podCreationTimestamp="2025-09-30 18:48:32 +0000 UTC" firstStartedPulling="2025-09-30 18:48:33.486701225 +0000 UTC m=+153.146181339" lastFinishedPulling="2025-09-30 18:49:01.317951577 +0000 UTC m=+180.977431701" observedRunningTime="2025-09-30 18:49:01.870875011 +0000 UTC m=+181.530355125" watchObservedRunningTime="2025-09-30 18:49:01.890064691 +0000 UTC m=+181.549544825" Sep 30 18:49:01 crc kubenswrapper[4747]: I0930 18:49:01.988489 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:49:02 crc kubenswrapper[4747]: I0930 18:49:02.536970 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:49:02 crc kubenswrapper[4747]: I0930 18:49:02.537349 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:49:02 crc kubenswrapper[4747]: I0930 18:49:02.801238 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:49:02 crc kubenswrapper[4747]: I0930 18:49:02.801286 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:49:03 crc kubenswrapper[4747]: I0930 18:49:03.574817 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vnw6d" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="registry-server" probeResult="failure" output=< Sep 30 18:49:03 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Sep 30 18:49:03 crc kubenswrapper[4747]: > Sep 30 18:49:03 crc kubenswrapper[4747]: I0930 18:49:03.859434 4747 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-x8f7n" podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="registry-server" probeResult="failure" output=< Sep 30 18:49:03 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Sep 30 18:49:03 crc kubenswrapper[4747]: > Sep 30 18:49:04 crc kubenswrapper[4747]: I0930 18:49:04.051283 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6t8n" Sep 30 18:49:07 crc kubenswrapper[4747]: I0930 18:49:07.655617 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:49:07 crc kubenswrapper[4747]: I0930 18:49:07.656150 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.213375 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.213462 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.266165 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.422112 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.422222 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.486321 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.717390 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.717488 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.788249 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.909308 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.909569 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.991752 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:49:09 crc kubenswrapper[4747]: I0930 18:49:09.993018 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:49:10 crc kubenswrapper[4747]: I0930 18:49:10.000743 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:49:10 crc kubenswrapper[4747]: I0930 
18:49:10.002504 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:49:10 crc kubenswrapper[4747]: I0930 18:49:10.121852 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 18:49:10 crc kubenswrapper[4747]: I0930 18:49:10.908213 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mzhtj"] Sep 30 18:49:10 crc kubenswrapper[4747]: I0930 18:49:10.989162 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:49:11 crc kubenswrapper[4747]: I0930 18:49:11.908756 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:49:11 crc kubenswrapper[4747]: I0930 18:49:11.942670 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mzhtj" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerName="registry-server" containerID="cri-o://d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa" gracePeriod=2 Sep 30 18:49:12 crc kubenswrapper[4747]: I0930 18:49:12.310985 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jz555"] Sep 30 18:49:12 crc kubenswrapper[4747]: I0930 18:49:12.615754 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:49:12 crc kubenswrapper[4747]: I0930 18:49:12.685907 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:49:12 crc kubenswrapper[4747]: E0930 18:49:12.783311 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61cf3ea9_c177_48e5_ab64_4300b58f4875.slice/crio-d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa.scope\": RecentStats: unable to find data in memory cache]" Sep 30 18:49:12 crc kubenswrapper[4747]: I0930 18:49:12.869869 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:49:12 crc kubenswrapper[4747]: I0930 18:49:12.930482 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:49:12 crc kubenswrapper[4747]: I0930 18:49:12.949114 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jz555" podUID="90237139-34d3-4f31-8b81-0b34418c19ff" containerName="registry-server" containerID="cri-o://556ed84891e56efb83d9fd3e50d87e58f664bf0c66ef0dbc53340643a210cb47" gracePeriod=2 Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.829037 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.943014 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-utilities\") pod \"61cf3ea9-c177-48e5-ab64-4300b58f4875\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.943108 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnwzx\" (UniqueName: \"kubernetes.io/projected/61cf3ea9-c177-48e5-ab64-4300b58f4875-kube-api-access-fnwzx\") pod \"61cf3ea9-c177-48e5-ab64-4300b58f4875\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.943138 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-catalog-content\") pod \"61cf3ea9-c177-48e5-ab64-4300b58f4875\" (UID: \"61cf3ea9-c177-48e5-ab64-4300b58f4875\") " Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.945354 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-utilities" (OuterVolumeSpecName: "utilities") pod "61cf3ea9-c177-48e5-ab64-4300b58f4875" (UID: "61cf3ea9-c177-48e5-ab64-4300b58f4875"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.965946 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61cf3ea9-c177-48e5-ab64-4300b58f4875-kube-api-access-fnwzx" (OuterVolumeSpecName: "kube-api-access-fnwzx") pod "61cf3ea9-c177-48e5-ab64-4300b58f4875" (UID: "61cf3ea9-c177-48e5-ab64-4300b58f4875"). InnerVolumeSpecName "kube-api-access-fnwzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.966562 4747 generic.go:334] "Generic (PLEG): container finished" podID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerID="d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa" exitCode=0 Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.966643 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzhtj" event={"ID":"61cf3ea9-c177-48e5-ab64-4300b58f4875","Type":"ContainerDied","Data":"d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa"} Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.966670 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzhtj" event={"ID":"61cf3ea9-c177-48e5-ab64-4300b58f4875","Type":"ContainerDied","Data":"e7e9c9982fdc4e47e2270f3f418e8dd3180c2c78bcb4503fa7d4d095866d01fa"} Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.966709 4747 scope.go:117] "RemoveContainer" containerID="d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa" Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.966839 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mzhtj" Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.972047 4747 generic.go:334] "Generic (PLEG): container finished" podID="90237139-34d3-4f31-8b81-0b34418c19ff" containerID="556ed84891e56efb83d9fd3e50d87e58f664bf0c66ef0dbc53340643a210cb47" exitCode=0 Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.972082 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz555" event={"ID":"90237139-34d3-4f31-8b81-0b34418c19ff","Type":"ContainerDied","Data":"556ed84891e56efb83d9fd3e50d87e58f664bf0c66ef0dbc53340643a210cb47"} Sep 30 18:49:13 crc kubenswrapper[4747]: I0930 18:49:13.982973 4747 scope.go:117] "RemoveContainer" containerID="5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.004662 4747 scope.go:117] "RemoveContainer" containerID="a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.014686 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61cf3ea9-c177-48e5-ab64-4300b58f4875" (UID: "61cf3ea9-c177-48e5-ab64-4300b58f4875"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.031669 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.036165 4747 scope.go:117] "RemoveContainer" containerID="d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa" Sep 30 18:49:14 crc kubenswrapper[4747]: E0930 18:49:14.037497 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa\": container with ID starting with d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa not found: ID does not exist" containerID="d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.037527 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa"} err="failed to get container status \"d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa\": rpc error: code = NotFound desc = could not find container \"d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa\": container with ID starting with d40a201e02884eb3fc2b5cf2c2714639f269d8dfb4167f211538b03b7e80bdfa not found: ID does not exist" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.037564 4747 scope.go:117] "RemoveContainer" containerID="5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14" Sep 30 18:49:14 crc kubenswrapper[4747]: E0930 18:49:14.037976 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14\": container with ID starting with 5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14 not found: ID does not exist" containerID="5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14" Sep 30 18:49:14 crc 
kubenswrapper[4747]: I0930 18:49:14.038139 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14"} err="failed to get container status \"5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14\": rpc error: code = NotFound desc = could not find container \"5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14\": container with ID starting with 5caf54440cb97d32a6cd8cb59d7e53dec3d5c2831afd9f2dcad9046103bebc14 not found: ID does not exist" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.038235 4747 scope.go:117] "RemoveContainer" containerID="a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f" Sep 30 18:49:14 crc kubenswrapper[4747]: E0930 18:49:14.038653 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f\": container with ID starting with a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f not found: ID does not exist" containerID="a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.038677 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f"} err="failed to get container status \"a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f\": rpc error: code = NotFound desc = could not find container \"a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f\": container with ID starting with a5a96e341c9983fddac32a3a0bf91ad38674039afaa491b4744b145d2934062f not found: ID does not exist" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.044957 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnwzx\" (UniqueName: 
\"kubernetes.io/projected/61cf3ea9-c177-48e5-ab64-4300b58f4875-kube-api-access-fnwzx\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.044991 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.045019 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61cf3ea9-c177-48e5-ab64-4300b58f4875-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.145700 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfhjt\" (UniqueName: \"kubernetes.io/projected/90237139-34d3-4f31-8b81-0b34418c19ff-kube-api-access-gfhjt\") pod \"90237139-34d3-4f31-8b81-0b34418c19ff\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.145857 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-catalog-content\") pod \"90237139-34d3-4f31-8b81-0b34418c19ff\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.145888 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-utilities\") pod \"90237139-34d3-4f31-8b81-0b34418c19ff\" (UID: \"90237139-34d3-4f31-8b81-0b34418c19ff\") " Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.147269 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-utilities" (OuterVolumeSpecName: "utilities") pod 
"90237139-34d3-4f31-8b81-0b34418c19ff" (UID: "90237139-34d3-4f31-8b81-0b34418c19ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.150412 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90237139-34d3-4f31-8b81-0b34418c19ff-kube-api-access-gfhjt" (OuterVolumeSpecName: "kube-api-access-gfhjt") pod "90237139-34d3-4f31-8b81-0b34418c19ff" (UID: "90237139-34d3-4f31-8b81-0b34418c19ff"). InnerVolumeSpecName "kube-api-access-gfhjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.187545 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90237139-34d3-4f31-8b81-0b34418c19ff" (UID: "90237139-34d3-4f31-8b81-0b34418c19ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.248050 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.248101 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90237139-34d3-4f31-8b81-0b34418c19ff-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.248119 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfhjt\" (UniqueName: \"kubernetes.io/projected/90237139-34d3-4f31-8b81-0b34418c19ff-kube-api-access-gfhjt\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.325438 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mzhtj"] Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.329543 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mzhtj"] Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.710235 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv9wq"] Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.710618 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hv9wq" podUID="cd093df4-4739-4052-84df-edac578c053a" containerName="registry-server" containerID="cri-o://a104763667a7cc96f4f5e2bb863038b3e203f4d8785ded97da6c54d53b64340b" gracePeriod=2 Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.982115 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jz555" 
event={"ID":"90237139-34d3-4f31-8b81-0b34418c19ff","Type":"ContainerDied","Data":"cae0ec302efa2c907e09007e83d6130f5e4b8e3911ee8c55c53af09abdb1a27f"} Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.982517 4747 scope.go:117] "RemoveContainer" containerID="556ed84891e56efb83d9fd3e50d87e58f664bf0c66ef0dbc53340643a210cb47" Sep 30 18:49:14 crc kubenswrapper[4747]: I0930 18:49:14.982708 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jz555" Sep 30 18:49:15 crc kubenswrapper[4747]: I0930 18:49:15.012753 4747 scope.go:117] "RemoveContainer" containerID="fd5ed43bbd8f38ba32645b0f272a78b5efd5b59aa237bb9f22f9383b79a1f941" Sep 30 18:49:15 crc kubenswrapper[4747]: I0930 18:49:15.037387 4747 scope.go:117] "RemoveContainer" containerID="9200cc3d58e817219c087aa3b84deaf5fce43f6ec5b92276893f0d34ee77f4c4" Sep 30 18:49:15 crc kubenswrapper[4747]: I0930 18:49:15.037428 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jz555"] Sep 30 18:49:15 crc kubenswrapper[4747]: I0930 18:49:15.042679 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jz555"] Sep 30 18:49:15 crc kubenswrapper[4747]: I0930 18:49:15.095642 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" path="/var/lib/kubelet/pods/61cf3ea9-c177-48e5-ab64-4300b58f4875/volumes" Sep 30 18:49:15 crc kubenswrapper[4747]: I0930 18:49:15.096892 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90237139-34d3-4f31-8b81-0b34418c19ff" path="/var/lib/kubelet/pods/90237139-34d3-4f31-8b81-0b34418c19ff/volumes" Sep 30 18:49:15 crc kubenswrapper[4747]: I0930 18:49:15.998027 4747 generic.go:334] "Generic (PLEG): container finished" podID="cd093df4-4739-4052-84df-edac578c053a" containerID="a104763667a7cc96f4f5e2bb863038b3e203f4d8785ded97da6c54d53b64340b" exitCode=0 Sep 30 
18:49:15 crc kubenswrapper[4747]: I0930 18:49:15.998116 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv9wq" event={"ID":"cd093df4-4739-4052-84df-edac578c053a","Type":"ContainerDied","Data":"a104763667a7cc96f4f5e2bb863038b3e203f4d8785ded97da6c54d53b64340b"} Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.254855 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.382385 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-catalog-content\") pod \"cd093df4-4739-4052-84df-edac578c053a\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.382441 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqx26\" (UniqueName: \"kubernetes.io/projected/cd093df4-4739-4052-84df-edac578c053a-kube-api-access-tqx26\") pod \"cd093df4-4739-4052-84df-edac578c053a\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.382587 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-utilities\") pod \"cd093df4-4739-4052-84df-edac578c053a\" (UID: \"cd093df4-4739-4052-84df-edac578c053a\") " Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.383603 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-utilities" (OuterVolumeSpecName: "utilities") pod "cd093df4-4739-4052-84df-edac578c053a" (UID: "cd093df4-4739-4052-84df-edac578c053a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.395519 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd093df4-4739-4052-84df-edac578c053a" (UID: "cd093df4-4739-4052-84df-edac578c053a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.395883 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd093df4-4739-4052-84df-edac578c053a-kube-api-access-tqx26" (OuterVolumeSpecName: "kube-api-access-tqx26") pod "cd093df4-4739-4052-84df-edac578c053a" (UID: "cd093df4-4739-4052-84df-edac578c053a"). InnerVolumeSpecName "kube-api-access-tqx26". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.483966 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.484001 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd093df4-4739-4052-84df-edac578c053a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:16 crc kubenswrapper[4747]: I0930 18:49:16.484048 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqx26\" (UniqueName: \"kubernetes.io/projected/cd093df4-4739-4052-84df-edac578c053a-kube-api-access-tqx26\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.009220 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv9wq" 
event={"ID":"cd093df4-4739-4052-84df-edac578c053a","Type":"ContainerDied","Data":"2ce369b0b66915a247cc05060bdef19f3170141f8a131621d11828c57321c8af"} Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.009286 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hv9wq" Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.009304 4747 scope.go:117] "RemoveContainer" containerID="a104763667a7cc96f4f5e2bb863038b3e203f4d8785ded97da6c54d53b64340b" Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.012306 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vs5" event={"ID":"dabd124e-9bb4-49f6-86e2-1ab79d3e314e","Type":"ContainerStarted","Data":"b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed"} Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.046319 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv9wq"] Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.048841 4747 scope.go:117] "RemoveContainer" containerID="7a82dc093612ffc3c045cbb98472e6c9d6e5a23420594c6e33c9f1e4dd8b1bbe" Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.053848 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv9wq"] Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.103421 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd093df4-4739-4052-84df-edac578c053a" path="/var/lib/kubelet/pods/cd093df4-4739-4052-84df-edac578c053a/volumes" Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.107706 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x8f7n"] Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.108069 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x8f7n" 
podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="registry-server" containerID="cri-o://44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359" gracePeriod=2 Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.118102 4747 scope.go:117] "RemoveContainer" containerID="8659397b42d31609a4bd7ad6706d1efb7aeeb54402724fe77d9496773eb84e4b" Sep 30 18:49:17 crc kubenswrapper[4747]: I0930 18:49:17.979833 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.019784 4747 generic.go:334] "Generic (PLEG): container finished" podID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerID="44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359" exitCode=0 Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.019899 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x8f7n" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.020122 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f7n" event={"ID":"edc0a7dd-970c-47f6-830b-27fbc0caaecb","Type":"ContainerDied","Data":"44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359"} Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.020184 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x8f7n" event={"ID":"edc0a7dd-970c-47f6-830b-27fbc0caaecb","Type":"ContainerDied","Data":"63c654b85c2a339edae93bccc594ff5c36b177bf519948e9fab49df7cca80eb2"} Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.020220 4747 scope.go:117] "RemoveContainer" containerID="44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.029729 4747 generic.go:334] "Generic (PLEG): container finished" podID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" 
containerID="b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed" exitCode=0 Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.029795 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vs5" event={"ID":"dabd124e-9bb4-49f6-86e2-1ab79d3e314e","Type":"ContainerDied","Data":"b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed"} Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.066040 4747 scope.go:117] "RemoveContainer" containerID="16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.088317 4747 scope.go:117] "RemoveContainer" containerID="dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.106022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-utilities\") pod \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.106112 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blv59\" (UniqueName: \"kubernetes.io/projected/edc0a7dd-970c-47f6-830b-27fbc0caaecb-kube-api-access-blv59\") pod \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.106189 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-catalog-content\") pod \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\" (UID: \"edc0a7dd-970c-47f6-830b-27fbc0caaecb\") " Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.107107 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-utilities" (OuterVolumeSpecName: "utilities") pod "edc0a7dd-970c-47f6-830b-27fbc0caaecb" (UID: "edc0a7dd-970c-47f6-830b-27fbc0caaecb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.113957 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc0a7dd-970c-47f6-830b-27fbc0caaecb-kube-api-access-blv59" (OuterVolumeSpecName: "kube-api-access-blv59") pod "edc0a7dd-970c-47f6-830b-27fbc0caaecb" (UID: "edc0a7dd-970c-47f6-830b-27fbc0caaecb"). InnerVolumeSpecName "kube-api-access-blv59". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.117133 4747 scope.go:117] "RemoveContainer" containerID="44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359" Sep 30 18:49:18 crc kubenswrapper[4747]: E0930 18:49:18.117552 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359\": container with ID starting with 44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359 not found: ID does not exist" containerID="44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.117611 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359"} err="failed to get container status \"44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359\": rpc error: code = NotFound desc = could not find container \"44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359\": container with ID starting with 44682ffa92ec2fd6d20b8e88291062da7f5a411cc5ae129bdda0268a1f0fc359 not found: ID does not exist" Sep 30 
18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.117651 4747 scope.go:117] "RemoveContainer" containerID="16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa" Sep 30 18:49:18 crc kubenswrapper[4747]: E0930 18:49:18.118326 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa\": container with ID starting with 16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa not found: ID does not exist" containerID="16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.118400 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa"} err="failed to get container status \"16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa\": rpc error: code = NotFound desc = could not find container \"16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa\": container with ID starting with 16f12bfca289629a63069a0ab219acf08e3dafba789affddd6a0559fb98369aa not found: ID does not exist" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.118427 4747 scope.go:117] "RemoveContainer" containerID="dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d" Sep 30 18:49:18 crc kubenswrapper[4747]: E0930 18:49:18.124103 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d\": container with ID starting with dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d not found: ID does not exist" containerID="dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.124157 4747 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d"} err="failed to get container status \"dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d\": rpc error: code = NotFound desc = could not find container \"dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d\": container with ID starting with dc80aec50ef9d90cae5abb4e6c8f77fd08ef98e9af0078da53f096adb3cf580d not found: ID does not exist" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.183081 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edc0a7dd-970c-47f6-830b-27fbc0caaecb" (UID: "edc0a7dd-970c-47f6-830b-27fbc0caaecb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.209335 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.209422 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blv59\" (UniqueName: \"kubernetes.io/projected/edc0a7dd-970c-47f6-830b-27fbc0caaecb-kube-api-access-blv59\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.209446 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0a7dd-970c-47f6-830b-27fbc0caaecb-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.387339 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x8f7n"] Sep 30 18:49:18 crc kubenswrapper[4747]: I0930 18:49:18.392538 4747 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-operators-x8f7n"] Sep 30 18:49:19 crc kubenswrapper[4747]: I0930 18:49:19.045321 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vs5" event={"ID":"dabd124e-9bb4-49f6-86e2-1ab79d3e314e","Type":"ContainerStarted","Data":"ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0"} Sep 30 18:49:19 crc kubenswrapper[4747]: I0930 18:49:19.073168 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2vs5" podStartSLOduration=1.907416209 podStartE2EDuration="48.073141295s" podCreationTimestamp="2025-09-30 18:48:31 +0000 UTC" firstStartedPulling="2025-09-30 18:48:32.441333541 +0000 UTC m=+152.100813655" lastFinishedPulling="2025-09-30 18:49:18.607058617 +0000 UTC m=+198.266538741" observedRunningTime="2025-09-30 18:49:19.071837504 +0000 UTC m=+198.731317668" watchObservedRunningTime="2025-09-30 18:49:19.073141295 +0000 UTC m=+198.732621429" Sep 30 18:49:19 crc kubenswrapper[4747]: I0930 18:49:19.097689 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" path="/var/lib/kubelet/pods/edc0a7dd-970c-47f6-830b-27fbc0caaecb/volumes" Sep 30 18:49:21 crc kubenswrapper[4747]: I0930 18:49:21.445153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:49:21 crc kubenswrapper[4747]: I0930 18:49:21.446388 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:49:21 crc kubenswrapper[4747]: I0930 18:49:21.498276 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:49:23 crc kubenswrapper[4747]: I0930 18:49:23.178586 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:49:37 crc kubenswrapper[4747]: I0930 18:49:37.655396 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:49:37 crc kubenswrapper[4747]: I0930 18:49:37.655898 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:49:37 crc kubenswrapper[4747]: I0930 18:49:37.655966 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:49:37 crc kubenswrapper[4747]: I0930 18:49:37.656490 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:49:37 crc kubenswrapper[4747]: I0930 18:49:37.656533 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9" gracePeriod=600 Sep 30 18:49:38 crc kubenswrapper[4747]: I0930 18:49:38.212272 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" 
containerID="e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9" exitCode=0 Sep 30 18:49:38 crc kubenswrapper[4747]: I0930 18:49:38.212321 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9"} Sep 30 18:49:38 crc kubenswrapper[4747]: I0930 18:49:38.212867 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"58baf4587e29954a036ee1a983f6ca3f3c55174a0a0fbc62706e1e634f34fbef"} Sep 30 18:49:52 crc kubenswrapper[4747]: I0930 18:49:52.613178 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kzs26"] Sep 30 18:50:17 crc kubenswrapper[4747]: I0930 18:50:17.655588 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" podUID="e53e2da5-c4a6-42ae-a59b-a064f1c8756b" containerName="oauth-openshift" containerID="cri-o://8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3" gracePeriod=15 Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.143999 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189311 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-666545c866-g5wls"] Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189643 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189710 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189726 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc172df-db4a-4bcc-8b41-d5fdbe7b7709" containerName="pruner" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189734 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc172df-db4a-4bcc-8b41-d5fdbe7b7709" containerName="pruner" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189745 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53e2da5-c4a6-42ae-a59b-a064f1c8756b" containerName="oauth-openshift" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189824 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53e2da5-c4a6-42ae-a59b-a064f1c8756b" containerName="oauth-openshift" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189833 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd093df4-4739-4052-84df-edac578c053a" containerName="extract-content" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189840 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd093df4-4739-4052-84df-edac578c053a" containerName="extract-content" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189852 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="extract-content" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189858 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="extract-content" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189869 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerName="extract-utilities" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189877 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerName="extract-utilities" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189886 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd093df4-4739-4052-84df-edac578c053a" containerName="extract-utilities" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189893 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd093df4-4739-4052-84df-edac578c053a" containerName="extract-utilities" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189904 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189911 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189918 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90237139-34d3-4f31-8b81-0b34418c19ff" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189947 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="90237139-34d3-4f31-8b81-0b34418c19ff" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189954 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="90237139-34d3-4f31-8b81-0b34418c19ff" containerName="extract-utilities" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189960 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="90237139-34d3-4f31-8b81-0b34418c19ff" containerName="extract-utilities" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189969 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerName="extract-content" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189975 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerName="extract-content" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189984 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="extract-utilities" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.189991 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="extract-utilities" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.189996 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90237139-34d3-4f31-8b81-0b34418c19ff" containerName="extract-content" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190002 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="90237139-34d3-4f31-8b81-0b34418c19ff" containerName="extract-content" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.190015 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd093df4-4739-4052-84df-edac578c053a" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190021 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd093df4-4739-4052-84df-edac578c053a" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.190029 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="36169e44-b2a1-4333-b002-d03b7d147842" containerName="pruner" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190039 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="36169e44-b2a1-4333-b002-d03b7d147842" containerName="pruner" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190172 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc0a7dd-970c-47f6-830b-27fbc0caaecb" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190183 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="90237139-34d3-4f31-8b81-0b34418c19ff" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190192 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="36169e44-b2a1-4333-b002-d03b7d147842" containerName="pruner" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190201 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd093df4-4739-4052-84df-edac578c053a" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190213 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53e2da5-c4a6-42ae-a59b-a064f1c8756b" containerName="oauth-openshift" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190220 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="61cf3ea9-c177-48e5-ab64-4300b58f4875" containerName="registry-server" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190233 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc172df-db4a-4bcc-8b41-d5fdbe7b7709" containerName="pruner" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.190848 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.199289 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-666545c866-g5wls"] Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.223679 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsp57\" (UniqueName: \"kubernetes.io/projected/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-kube-api-access-nsp57\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.223744 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-dir\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.223807 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-trusted-ca-bundle\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.223861 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-login\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.223900 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-serving-cert\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.223968 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-router-certs\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.223957 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.224011 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-idp-0-file-data\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.224054 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-session\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.224098 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-provider-selection\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.224135 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-error\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.224173 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-service-ca\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.224208 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-policies\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.224246 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-cliconfig\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.225198 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod 
"e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.225331 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-ocp-branding-template\") pod \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\" (UID: \"e53e2da5-c4a6-42ae-a59b-a064f1c8756b\") " Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.225873 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.226035 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.226425 4747 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.227525 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.232648 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.240359 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.240649 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.241116 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.241617 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.245607 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-kube-api-access-nsp57" (OuterVolumeSpecName: "kube-api-access-nsp57") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "kube-api-access-nsp57". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.245625 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.247413 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.247547 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.247788 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e53e2da5-c4a6-42ae-a59b-a064f1c8756b" (UID: "e53e2da5-c4a6-42ae-a59b-a064f1c8756b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.327634 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.327724 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.327766 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-service-ca\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.327819 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52e6a36f-4b46-4e29-aa4d-101eae89c828-audit-dir\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.327913 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-session\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.327975 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-router-certs\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328002 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-audit-policies\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328028 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-782fj\" (UniqueName: \"kubernetes.io/projected/52e6a36f-4b46-4e29-aa4d-101eae89c828-kube-api-access-782fj\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328057 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328291 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-cliconfig\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328400 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-error\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328482 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-login\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328519 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-serving-cert\") pod \"oauth-openshift-666545c866-g5wls\" (UID: 
\"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328586 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328680 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328721 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328755 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328776 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328799 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328824 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328851 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328876 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328895 4747 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328919 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.328984 4747 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.329008 4747 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsp57\" (UniqueName: \"kubernetes.io/projected/e53e2da5-c4a6-42ae-a59b-a064f1c8756b-kube-api-access-nsp57\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.430725 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-session\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.430797 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-router-certs\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.430840 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-audit-policies\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.430891 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-782fj\" (UniqueName: \"kubernetes.io/projected/52e6a36f-4b46-4e29-aa4d-101eae89c828-kube-api-access-782fj\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.430986 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.431135 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-cliconfig\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.431235 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-error\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.431299 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-login\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.431349 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.431414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.431495 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.431551 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.431605 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-service-ca\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc 
kubenswrapper[4747]: I0930 18:50:18.431660 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52e6a36f-4b46-4e29-aa4d-101eae89c828-audit-dir\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.432573 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-audit-policies\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.432694 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-cliconfig\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.433319 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/52e6a36f-4b46-4e29-aa4d-101eae89c828-audit-dir\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.433641 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-service-ca\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " 
pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.433815 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.435489 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-router-certs\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.437982 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-error\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.438115 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-login\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.438113 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-session\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.439172 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.439600 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.440124 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-system-serving-cert\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.441049 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/52e6a36f-4b46-4e29-aa4d-101eae89c828-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " 
pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.462389 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-782fj\" (UniqueName: \"kubernetes.io/projected/52e6a36f-4b46-4e29-aa4d-101eae89c828-kube-api-access-782fj\") pod \"oauth-openshift-666545c866-g5wls\" (UID: \"52e6a36f-4b46-4e29-aa4d-101eae89c828\") " pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.462917 4747 generic.go:334] "Generic (PLEG): container finished" podID="e53e2da5-c4a6-42ae-a59b-a064f1c8756b" containerID="8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3" exitCode=0 Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.463006 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" event={"ID":"e53e2da5-c4a6-42ae-a59b-a064f1c8756b","Type":"ContainerDied","Data":"8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3"} Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.463141 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" event={"ID":"e53e2da5-c4a6-42ae-a59b-a064f1c8756b","Type":"ContainerDied","Data":"def72629eb1b2f26a0165f54a8c1c31aed4e2fc4eefcdc36b3a52b6809cd72d1"} Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.463174 4747 scope.go:117] "RemoveContainer" containerID="8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.463308 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kzs26" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.488875 4747 scope.go:117] "RemoveContainer" containerID="8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3" Sep 30 18:50:18 crc kubenswrapper[4747]: E0930 18:50:18.489439 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3\": container with ID starting with 8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3 not found: ID does not exist" containerID="8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.489476 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3"} err="failed to get container status \"8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3\": rpc error: code = NotFound desc = could not find container \"8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3\": container with ID starting with 8cc871a9fec30f88470311d501d23e7094894b425ca2633146f00b879cdb05b3 not found: ID does not exist" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.502796 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kzs26"] Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.506236 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kzs26"] Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.538397 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:18 crc kubenswrapper[4747]: I0930 18:50:18.807539 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-666545c866-g5wls"] Sep 30 18:50:18 crc kubenswrapper[4747]: W0930 18:50:18.821395 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e6a36f_4b46_4e29_aa4d_101eae89c828.slice/crio-ca3ffd1f0d738c8659b10359b41f6c7304e2e59ddfbbdfd0c1200d9d93d41ec1 WatchSource:0}: Error finding container ca3ffd1f0d738c8659b10359b41f6c7304e2e59ddfbbdfd0c1200d9d93d41ec1: Status 404 returned error can't find the container with id ca3ffd1f0d738c8659b10359b41f6c7304e2e59ddfbbdfd0c1200d9d93d41ec1 Sep 30 18:50:19 crc kubenswrapper[4747]: I0930 18:50:19.093269 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e53e2da5-c4a6-42ae-a59b-a064f1c8756b" path="/var/lib/kubelet/pods/e53e2da5-c4a6-42ae-a59b-a064f1c8756b/volumes" Sep 30 18:50:19 crc kubenswrapper[4747]: I0930 18:50:19.472265 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-666545c866-g5wls" event={"ID":"52e6a36f-4b46-4e29-aa4d-101eae89c828","Type":"ContainerStarted","Data":"5943e69a7369fbd6a2c7240032896f2ef8177eb5e5c907288516269b7e53f2ce"} Sep 30 18:50:19 crc kubenswrapper[4747]: I0930 18:50:19.472313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-666545c866-g5wls" event={"ID":"52e6a36f-4b46-4e29-aa4d-101eae89c828","Type":"ContainerStarted","Data":"ca3ffd1f0d738c8659b10359b41f6c7304e2e59ddfbbdfd0c1200d9d93d41ec1"} Sep 30 18:50:19 crc kubenswrapper[4747]: I0930 18:50:19.472609 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:19 crc kubenswrapper[4747]: I0930 18:50:19.477912 4747 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-666545c866-g5wls" Sep 30 18:50:19 crc kubenswrapper[4747]: I0930 18:50:19.528037 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-666545c866-g5wls" podStartSLOduration=27.528007759 podStartE2EDuration="27.528007759s" podCreationTimestamp="2025-09-30 18:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:50:19.501786473 +0000 UTC m=+259.161266637" watchObservedRunningTime="2025-09-30 18:50:19.528007759 +0000 UTC m=+259.187487913" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.192825 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrfvq"] Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.193972 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qrfvq" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerName="registry-server" containerID="cri-o://2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df" gracePeriod=30 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.204796 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84v6x"] Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.205074 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84v6x" podUID="26c98ff0-f864-4301-9423-f037408bce18" containerName="registry-server" containerID="cri-o://104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c" gracePeriod=30 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.208685 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kfbg"] Sep 
30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.208850 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" podUID="0007394a-7089-407b-ad0f-25c9794ccefe" containerName="marketplace-operator" containerID="cri-o://31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44" gracePeriod=30 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.210800 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vs5"] Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.236522 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vnw6d"] Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.237131 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vnw6d" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="registry-server" containerID="cri-o://8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9" gracePeriod=30 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.247342 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wl22v"] Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.248294 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.258003 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wl22v"] Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.373651 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlkjk\" (UniqueName: \"kubernetes.io/projected/461defee-db4a-4cf5-bb7b-bffeb4bdf244-kube-api-access-hlkjk\") pod \"marketplace-operator-79b997595-wl22v\" (UID: \"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.373745 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/461defee-db4a-4cf5-bb7b-bffeb4bdf244-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wl22v\" (UID: \"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.373771 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/461defee-db4a-4cf5-bb7b-bffeb4bdf244-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wl22v\" (UID: \"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.474576 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlkjk\" (UniqueName: \"kubernetes.io/projected/461defee-db4a-4cf5-bb7b-bffeb4bdf244-kube-api-access-hlkjk\") pod \"marketplace-operator-79b997595-wl22v\" (UID: 
\"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.474657 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/461defee-db4a-4cf5-bb7b-bffeb4bdf244-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wl22v\" (UID: \"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.474681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/461defee-db4a-4cf5-bb7b-bffeb4bdf244-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wl22v\" (UID: \"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.476840 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/461defee-db4a-4cf5-bb7b-bffeb4bdf244-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wl22v\" (UID: \"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.485229 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/461defee-db4a-4cf5-bb7b-bffeb4bdf244-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wl22v\" (UID: \"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.501390 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hlkjk\" (UniqueName: \"kubernetes.io/projected/461defee-db4a-4cf5-bb7b-bffeb4bdf244-kube-api-access-hlkjk\") pod \"marketplace-operator-79b997595-wl22v\" (UID: \"461defee-db4a-4cf5-bb7b-bffeb4bdf244\") " pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.640796 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.647285 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.662516 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.667895 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.672737 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.678620 4747 generic.go:334] "Generic (PLEG): container finished" podID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerID="2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df" exitCode=0 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.678701 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrfvq" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.678704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrfvq" event={"ID":"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042","Type":"ContainerDied","Data":"2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df"} Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.678845 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrfvq" event={"ID":"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042","Type":"ContainerDied","Data":"031e65a485c574b55dbbd88ea40bd10679f14faaa67fdfcb89126f4087338659"} Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.678866 4747 scope.go:117] "RemoveContainer" containerID="2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.681096 4747 generic.go:334] "Generic (PLEG): container finished" podID="26c98ff0-f864-4301-9423-f037408bce18" containerID="104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c" exitCode=0 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.681166 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84v6x" event={"ID":"26c98ff0-f864-4301-9423-f037408bce18","Type":"ContainerDied","Data":"104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c"} Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.681196 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84v6x" event={"ID":"26c98ff0-f864-4301-9423-f037408bce18","Type":"ContainerDied","Data":"a509c1dbfd54bbf384d65805647a5aadc50f970cfd087da96781f9c7e786dc87"} Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.681256 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84v6x" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.682726 4747 generic.go:334] "Generic (PLEG): container finished" podID="0007394a-7089-407b-ad0f-25c9794ccefe" containerID="31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44" exitCode=0 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.682767 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" event={"ID":"0007394a-7089-407b-ad0f-25c9794ccefe","Type":"ContainerDied","Data":"31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44"} Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.682782 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" event={"ID":"0007394a-7089-407b-ad0f-25c9794ccefe","Type":"ContainerDied","Data":"a6cf40e76bc6ff640438c172691b7cc4c3980fe848b9ac721732e01b2ed2dc27"} Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.682822 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8kfbg" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.684960 4747 generic.go:334] "Generic (PLEG): container finished" podID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerID="8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9" exitCode=0 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.685021 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vnw6d" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.685075 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnw6d" event={"ID":"95d8b334-0a57-47a1-bfea-3e30f6527e13","Type":"ContainerDied","Data":"8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9"} Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.685106 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vnw6d" event={"ID":"95d8b334-0a57-47a1-bfea-3e30f6527e13","Type":"ContainerDied","Data":"9e664045c2b6a4b2213b646e8a33cd4f24919f306b6af3d0ee8f8dafe379d9ea"} Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.685225 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2vs5" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerName="registry-server" containerID="cri-o://ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0" gracePeriod=30 Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.705570 4747 scope.go:117] "RemoveContainer" containerID="201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.729003 4747 scope.go:117] "RemoveContainer" containerID="1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780354 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-catalog-content\") pod \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780398 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6hd\" (UniqueName: 
\"kubernetes.io/projected/0007394a-7089-407b-ad0f-25c9794ccefe-kube-api-access-2q6hd\") pod \"0007394a-7089-407b-ad0f-25c9794ccefe\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780427 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-trusted-ca\") pod \"0007394a-7089-407b-ad0f-25c9794ccefe\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780448 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-catalog-content\") pod \"26c98ff0-f864-4301-9423-f037408bce18\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780475 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmpzl\" (UniqueName: \"kubernetes.io/projected/26c98ff0-f864-4301-9423-f037408bce18-kube-api-access-jmpzl\") pod \"26c98ff0-f864-4301-9423-f037408bce18\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780510 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42m7f\" (UniqueName: \"kubernetes.io/projected/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-kube-api-access-42m7f\") pod \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780525 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-utilities\") pod \"95d8b334-0a57-47a1-bfea-3e30f6527e13\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " Sep 30 
18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780550 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qksf6\" (UniqueName: \"kubernetes.io/projected/95d8b334-0a57-47a1-bfea-3e30f6527e13-kube-api-access-qksf6\") pod \"95d8b334-0a57-47a1-bfea-3e30f6527e13\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780580 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-catalog-content\") pod \"95d8b334-0a57-47a1-bfea-3e30f6527e13\" (UID: \"95d8b334-0a57-47a1-bfea-3e30f6527e13\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780603 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-utilities\") pod \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\" (UID: \"8b9fc6cc-7437-4639-9c0a-05e8c2ce1042\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780623 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-operator-metrics\") pod \"0007394a-7089-407b-ad0f-25c9794ccefe\" (UID: \"0007394a-7089-407b-ad0f-25c9794ccefe\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.780651 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-utilities\") pod \"26c98ff0-f864-4301-9423-f037408bce18\" (UID: \"26c98ff0-f864-4301-9423-f037408bce18\") " Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.794032 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0007394a-7089-407b-ad0f-25c9794ccefe-kube-api-access-2q6hd" (OuterVolumeSpecName: "kube-api-access-2q6hd") pod "0007394a-7089-407b-ad0f-25c9794ccefe" (UID: "0007394a-7089-407b-ad0f-25c9794ccefe"). InnerVolumeSpecName "kube-api-access-2q6hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.794629 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-utilities" (OuterVolumeSpecName: "utilities") pod "26c98ff0-f864-4301-9423-f037408bce18" (UID: "26c98ff0-f864-4301-9423-f037408bce18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.802569 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-utilities" (OuterVolumeSpecName: "utilities") pod "8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" (UID: "8b9fc6cc-7437-4639-9c0a-05e8c2ce1042"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.805054 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0007394a-7089-407b-ad0f-25c9794ccefe" (UID: "0007394a-7089-407b-ad0f-25c9794ccefe"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.816634 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d8b334-0a57-47a1-bfea-3e30f6527e13-kube-api-access-qksf6" (OuterVolumeSpecName: "kube-api-access-qksf6") pod "95d8b334-0a57-47a1-bfea-3e30f6527e13" (UID: "95d8b334-0a57-47a1-bfea-3e30f6527e13"). InnerVolumeSpecName "kube-api-access-qksf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.816626 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0007394a-7089-407b-ad0f-25c9794ccefe" (UID: "0007394a-7089-407b-ad0f-25c9794ccefe"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.817342 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-utilities" (OuterVolumeSpecName: "utilities") pod "95d8b334-0a57-47a1-bfea-3e30f6527e13" (UID: "95d8b334-0a57-47a1-bfea-3e30f6527e13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.818889 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c98ff0-f864-4301-9423-f037408bce18-kube-api-access-jmpzl" (OuterVolumeSpecName: "kube-api-access-jmpzl") pod "26c98ff0-f864-4301-9423-f037408bce18" (UID: "26c98ff0-f864-4301-9423-f037408bce18"). InnerVolumeSpecName "kube-api-access-jmpzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.822134 4747 scope.go:117] "RemoveContainer" containerID="2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df" Sep 30 18:50:50 crc kubenswrapper[4747]: E0930 18:50:50.822535 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df\": container with ID starting with 2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df not found: ID does not exist" containerID="2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.822572 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df"} err="failed to get container status \"2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df\": rpc error: code = NotFound desc = could not find container \"2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df\": container with ID starting with 2f729377c8c5ac40ad79d6b7356dba2c4c9d95b15b51d17986523d05ed6624df not found: ID does not exist" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.822599 4747 scope.go:117] "RemoveContainer" containerID="201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1" Sep 30 18:50:50 crc kubenswrapper[4747]: E0930 18:50:50.822864 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1\": container with ID starting with 201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1 not found: ID does not exist" containerID="201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.822893 
4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1"} err="failed to get container status \"201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1\": rpc error: code = NotFound desc = could not find container \"201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1\": container with ID starting with 201ddbe1e0a3284bc4073fd301422502d9fe9a396a947854626319378ad68dc1 not found: ID does not exist" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.822911 4747 scope.go:117] "RemoveContainer" containerID="1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476" Sep 30 18:50:50 crc kubenswrapper[4747]: E0930 18:50:50.823100 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476\": container with ID starting with 1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476 not found: ID does not exist" containerID="1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.823125 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476"} err="failed to get container status \"1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476\": rpc error: code = NotFound desc = could not find container \"1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476\": container with ID starting with 1cd8c8e8671113c6469144769cc66128ac45ede5851e7ae2864341d78e1b4476 not found: ID does not exist" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.823139 4747 scope.go:117] "RemoveContainer" containerID="104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 
18:50:50.828166 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-kube-api-access-42m7f" (OuterVolumeSpecName: "kube-api-access-42m7f") pod "8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" (UID: "8b9fc6cc-7437-4639-9c0a-05e8c2ce1042"). InnerVolumeSpecName "kube-api-access-42m7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.840957 4747 scope.go:117] "RemoveContainer" containerID="1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.865673 4747 scope.go:117] "RemoveContainer" containerID="3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882152 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6hd\" (UniqueName: \"kubernetes.io/projected/0007394a-7089-407b-ad0f-25c9794ccefe-kube-api-access-2q6hd\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882177 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882187 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmpzl\" (UniqueName: \"kubernetes.io/projected/26c98ff0-f864-4301-9423-f037408bce18-kube-api-access-jmpzl\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882197 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42m7f\" (UniqueName: \"kubernetes.io/projected/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-kube-api-access-42m7f\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882206 4747 reconciler_common.go:293] 
"Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882215 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qksf6\" (UniqueName: \"kubernetes.io/projected/95d8b334-0a57-47a1-bfea-3e30f6527e13-kube-api-access-qksf6\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882224 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882232 4747 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0007394a-7089-407b-ad0f-25c9794ccefe-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.882242 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.903653 4747 scope.go:117] "RemoveContainer" containerID="104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c" Sep 30 18:50:50 crc kubenswrapper[4747]: E0930 18:50:50.904331 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c\": container with ID starting with 104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c not found: ID does not exist" containerID="104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.904361 4747 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c"} err="failed to get container status \"104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c\": rpc error: code = NotFound desc = could not find container \"104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c\": container with ID starting with 104a4c6cd9560eb8329445632d3f0468a16b1c0fd442cf71115ea20ed3e15c1c not found: ID does not exist" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.904381 4747 scope.go:117] "RemoveContainer" containerID="1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054" Sep 30 18:50:50 crc kubenswrapper[4747]: E0930 18:50:50.905453 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054\": container with ID starting with 1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054 not found: ID does not exist" containerID="1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.905474 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054"} err="failed to get container status \"1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054\": rpc error: code = NotFound desc = could not find container \"1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054\": container with ID starting with 1cda1bdfd0b3646ae6139cc5ab98bea3a4c2af5d9e0ae4ca2a970f0d06709054 not found: ID does not exist" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.905488 4747 scope.go:117] "RemoveContainer" containerID="3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad" Sep 30 18:50:50 crc kubenswrapper[4747]: E0930 18:50:50.906233 4747 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad\": container with ID starting with 3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad not found: ID does not exist" containerID="3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.906258 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad"} err="failed to get container status \"3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad\": rpc error: code = NotFound desc = could not find container \"3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad\": container with ID starting with 3812861699a7a69facf99f675b3b4609a9564f0615784e0af2f0cc56d501aaad not found: ID does not exist" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.906273 4747 scope.go:117] "RemoveContainer" containerID="31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.916874 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26c98ff0-f864-4301-9423-f037408bce18" (UID: "26c98ff0-f864-4301-9423-f037408bce18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.923396 4747 scope.go:117] "RemoveContainer" containerID="31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44" Sep 30 18:50:50 crc kubenswrapper[4747]: E0930 18:50:50.923784 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44\": container with ID starting with 31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44 not found: ID does not exist" containerID="31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.923822 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44"} err="failed to get container status \"31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44\": rpc error: code = NotFound desc = could not find container \"31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44\": container with ID starting with 31a64b87efaa9f75c622b172ebe1c994b328c3cf8b9d9f9ecc7900a379865f44 not found: ID does not exist" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.923852 4747 scope.go:117] "RemoveContainer" containerID="8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.928866 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" (UID: "8b9fc6cc-7437-4639-9c0a-05e8c2ce1042"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.939059 4747 scope.go:117] "RemoveContainer" containerID="a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.948285 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95d8b334-0a57-47a1-bfea-3e30f6527e13" (UID: "95d8b334-0a57-47a1-bfea-3e30f6527e13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.961231 4747 scope.go:117] "RemoveContainer" containerID="f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.967333 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wl22v"] Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.983489 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.983640 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c98ff0-f864-4301-9423-f037408bce18-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:50 crc kubenswrapper[4747]: I0930 18:50:50.983670 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d8b334-0a57-47a1-bfea-3e30f6527e13-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.008372 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-84v6x"] Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.012787 4747 scope.go:117] "RemoveContainer" containerID="8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9" Sep 30 18:50:51 crc kubenswrapper[4747]: E0930 18:50:51.014172 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9\": container with ID starting with 8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9 not found: ID does not exist" containerID="8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.014232 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9"} err="failed to get container status \"8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9\": rpc error: code = NotFound desc = could not find container \"8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9\": container with ID starting with 8bc4b24bb15e590c18024f5ed159c6aab8f798276c69d675efef9d88c7ad1ba9 not found: ID does not exist" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.014270 4747 scope.go:117] "RemoveContainer" containerID="a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad" Sep 30 18:50:51 crc kubenswrapper[4747]: E0930 18:50:51.015497 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad\": container with ID starting with a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad not found: ID does not exist" containerID="a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 
18:50:51.015545 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad"} err="failed to get container status \"a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad\": rpc error: code = NotFound desc = could not find container \"a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad\": container with ID starting with a5f254f6c521d69dae142d6f4148cbc878c66acf4cfc7f54dd753e7ff0ef0aad not found: ID does not exist" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.015575 4747 scope.go:117] "RemoveContainer" containerID="f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee" Sep 30 18:50:51 crc kubenswrapper[4747]: E0930 18:50:51.015961 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee\": container with ID starting with f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee not found: ID does not exist" containerID="f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.015993 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee"} err="failed to get container status \"f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee\": rpc error: code = NotFound desc = could not find container \"f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee\": container with ID starting with f91fc52cbd2dd72b43f034734cc1da1a469c1cbb347eb67d6f243348bc4ed4ee not found: ID does not exist" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.018578 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84v6x"] Sep 30 18:50:51 crc kubenswrapper[4747]: 
I0930 18:50:51.027180 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vnw6d"] Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.033082 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vnw6d"] Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.035785 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.053097 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kfbg"] Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.059217 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8kfbg"] Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.081968 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrfvq"] Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.097537 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0007394a-7089-407b-ad0f-25c9794ccefe" path="/var/lib/kubelet/pods/0007394a-7089-407b-ad0f-25c9794ccefe/volumes" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.098794 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c98ff0-f864-4301-9423-f037408bce18" path="/var/lib/kubelet/pods/26c98ff0-f864-4301-9423-f037408bce18/volumes" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.100618 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" path="/var/lib/kubelet/pods/95d8b334-0a57-47a1-bfea-3e30f6527e13/volumes" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.105984 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qrfvq"] Sep 30 18:50:51 crc 
kubenswrapper[4747]: I0930 18:50:51.186466 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m66z5\" (UniqueName: \"kubernetes.io/projected/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-kube-api-access-m66z5\") pod \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.187709 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-catalog-content\") pod \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.187790 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-utilities\") pod \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\" (UID: \"dabd124e-9bb4-49f6-86e2-1ab79d3e314e\") " Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.189566 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-utilities" (OuterVolumeSpecName: "utilities") pod "dabd124e-9bb4-49f6-86e2-1ab79d3e314e" (UID: "dabd124e-9bb4-49f6-86e2-1ab79d3e314e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.191491 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-kube-api-access-m66z5" (OuterVolumeSpecName: "kube-api-access-m66z5") pod "dabd124e-9bb4-49f6-86e2-1ab79d3e314e" (UID: "dabd124e-9bb4-49f6-86e2-1ab79d3e314e"). InnerVolumeSpecName "kube-api-access-m66z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.216382 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dabd124e-9bb4-49f6-86e2-1ab79d3e314e" (UID: "dabd124e-9bb4-49f6-86e2-1ab79d3e314e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.289451 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.289488 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.289502 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m66z5\" (UniqueName: \"kubernetes.io/projected/dabd124e-9bb4-49f6-86e2-1ab79d3e314e-kube-api-access-m66z5\") on node \"crc\" DevicePath \"\"" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.695248 4747 generic.go:334] "Generic (PLEG): container finished" podID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerID="ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0" exitCode=0 Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.695365 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vs5" event={"ID":"dabd124e-9bb4-49f6-86e2-1ab79d3e314e","Type":"ContainerDied","Data":"ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0"} Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.695397 4747 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2vs5" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.695478 4747 scope.go:117] "RemoveContainer" containerID="ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.695457 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2vs5" event={"ID":"dabd124e-9bb4-49f6-86e2-1ab79d3e314e","Type":"ContainerDied","Data":"6028f4ce71ce2d850e8dcc2882969f505add34a5d692ca50ec93dcaf42b8c872"} Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.701559 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" event={"ID":"461defee-db4a-4cf5-bb7b-bffeb4bdf244","Type":"ContainerStarted","Data":"b6e65f64cb6d006b9f36d4986fe3707b389e8107f8af984c8354678a975ba863"} Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.701611 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" event={"ID":"461defee-db4a-4cf5-bb7b-bffeb4bdf244","Type":"ContainerStarted","Data":"0092abab4f23e2ce0b1ec2bf1bb73340c375a965be7996aa3e1aa3d98c475cba"} Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.703550 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.710564 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.726362 4747 scope.go:117] "RemoveContainer" containerID="b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.727343 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/marketplace-operator-79b997595-wl22v" podStartSLOduration=1.727309615 podStartE2EDuration="1.727309615s" podCreationTimestamp="2025-09-30 18:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:50:51.726979894 +0000 UTC m=+291.386460048" watchObservedRunningTime="2025-09-30 18:50:51.727309615 +0000 UTC m=+291.386789739" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.749435 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vs5"] Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.761447 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2vs5"] Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.776683 4747 scope.go:117] "RemoveContainer" containerID="0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.805257 4747 scope.go:117] "RemoveContainer" containerID="ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0" Sep 30 18:50:51 crc kubenswrapper[4747]: E0930 18:50:51.806358 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0\": container with ID starting with ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0 not found: ID does not exist" containerID="ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.806391 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0"} err="failed to get container status \"ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0\": rpc error: code = NotFound desc = could not 
find container \"ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0\": container with ID starting with ef8f4d279d1be6104e6ddbf1e2e77ec7a369015d7daa97b36ec02ff2201818a0 not found: ID does not exist" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.806416 4747 scope.go:117] "RemoveContainer" containerID="b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed" Sep 30 18:50:51 crc kubenswrapper[4747]: E0930 18:50:51.806745 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed\": container with ID starting with b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed not found: ID does not exist" containerID="b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.806766 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed"} err="failed to get container status \"b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed\": rpc error: code = NotFound desc = could not find container \"b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed\": container with ID starting with b7555a2d6eb3b07c4f37c336646d894dde311f8b23f42838861e2075254e3aed not found: ID does not exist" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.806780 4747 scope.go:117] "RemoveContainer" containerID="0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae" Sep 30 18:50:51 crc kubenswrapper[4747]: E0930 18:50:51.807208 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae\": container with ID starting with 0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae not found: ID 
does not exist" containerID="0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae" Sep 30 18:50:51 crc kubenswrapper[4747]: I0930 18:50:51.807227 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae"} err="failed to get container status \"0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae\": rpc error: code = NotFound desc = could not find container \"0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae\": container with ID starting with 0ff4ff98cd731b53edadf913c311382f0fa9a1ab16d245480f4c4ace8ca9d2ae not found: ID does not exist" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.418231 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qj7z8"] Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.418786 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="extract-content" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.418810 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="extract-content" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.418827 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerName="extract-utilities" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.418840 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerName="extract-utilities" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.418857 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.418873 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.418887 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="extract-utilities" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.418899 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="extract-utilities" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.418918 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c98ff0-f864-4301-9423-f037408bce18" containerName="extract-utilities" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.418958 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c98ff0-f864-4301-9423-f037408bce18" containerName="extract-utilities" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.418977 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0007394a-7089-407b-ad0f-25c9794ccefe" containerName="marketplace-operator" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.418989 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="0007394a-7089-407b-ad0f-25c9794ccefe" containerName="marketplace-operator" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.419006 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerName="extract-content" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419018 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerName="extract-content" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.419033 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419046 4747 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.419069 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerName="extract-utilities" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419081 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerName="extract-utilities" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.419099 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419111 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.419129 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerName="extract-content" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419141 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerName="extract-content" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.419160 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c98ff0-f864-4301-9423-f037408bce18" containerName="extract-content" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419172 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c98ff0-f864-4301-9423-f037408bce18" containerName="extract-content" Sep 30 18:50:52 crc kubenswrapper[4747]: E0930 18:50:52.419184 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c98ff0-f864-4301-9423-f037408bce18" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419196 4747 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="26c98ff0-f864-4301-9423-f037408bce18" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419385 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419420 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419446 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d8b334-0a57-47a1-bfea-3e30f6527e13" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419469 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="0007394a-7089-407b-ad0f-25c9794ccefe" containerName="marketplace-operator" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.419487 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c98ff0-f864-4301-9423-f037408bce18" containerName="registry-server" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.421107 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.427014 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.431350 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qj7z8"] Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.508006 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgslr\" (UniqueName: \"kubernetes.io/projected/f33b4813-db02-4f07-9662-58046af264f2-kube-api-access-vgslr\") pod \"redhat-marketplace-qj7z8\" (UID: \"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.508074 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f33b4813-db02-4f07-9662-58046af264f2-catalog-content\") pod \"redhat-marketplace-qj7z8\" (UID: \"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.508105 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f33b4813-db02-4f07-9662-58046af264f2-utilities\") pod \"redhat-marketplace-qj7z8\" (UID: \"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.608988 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgslr\" (UniqueName: \"kubernetes.io/projected/f33b4813-db02-4f07-9662-58046af264f2-kube-api-access-vgslr\") pod \"redhat-marketplace-qj7z8\" (UID: 
\"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.609043 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f33b4813-db02-4f07-9662-58046af264f2-catalog-content\") pod \"redhat-marketplace-qj7z8\" (UID: \"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.609071 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f33b4813-db02-4f07-9662-58046af264f2-utilities\") pod \"redhat-marketplace-qj7z8\" (UID: \"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.609147 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9bc55"] Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.609679 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f33b4813-db02-4f07-9662-58046af264f2-utilities\") pod \"redhat-marketplace-qj7z8\" (UID: \"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.609744 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f33b4813-db02-4f07-9662-58046af264f2-catalog-content\") pod \"redhat-marketplace-qj7z8\" (UID: \"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.610353 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.617934 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.624289 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9bc55"] Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.645860 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgslr\" (UniqueName: \"kubernetes.io/projected/f33b4813-db02-4f07-9662-58046af264f2-kube-api-access-vgslr\") pod \"redhat-marketplace-qj7z8\" (UID: \"f33b4813-db02-4f07-9662-58046af264f2\") " pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.710847 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-utilities\") pod \"redhat-operators-9bc55\" (UID: \"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.711032 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-catalog-content\") pod \"redhat-operators-9bc55\" (UID: \"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.711107 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlrz9\" (UniqueName: \"kubernetes.io/projected/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-kube-api-access-qlrz9\") pod \"redhat-operators-9bc55\" (UID: 
\"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.750537 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.812433 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-catalog-content\") pod \"redhat-operators-9bc55\" (UID: \"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.812531 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlrz9\" (UniqueName: \"kubernetes.io/projected/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-kube-api-access-qlrz9\") pod \"redhat-operators-9bc55\" (UID: \"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.812554 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-utilities\") pod \"redhat-operators-9bc55\" (UID: \"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.813036 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-utilities\") pod \"redhat-operators-9bc55\" (UID: \"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.813019 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-catalog-content\") pod \"redhat-operators-9bc55\" (UID: \"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.830266 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlrz9\" (UniqueName: \"kubernetes.io/projected/9b4f7959-bc18-4ea1-b257-4a1a1aa17395-kube-api-access-qlrz9\") pod \"redhat-operators-9bc55\" (UID: \"9b4f7959-bc18-4ea1-b257-4a1a1aa17395\") " pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.950212 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qj7z8"] Sep 30 18:50:52 crc kubenswrapper[4747]: I0930 18:50:52.995587 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.096224 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9fc6cc-7437-4639-9c0a-05e8c2ce1042" path="/var/lib/kubelet/pods/8b9fc6cc-7437-4639-9c0a-05e8c2ce1042/volumes" Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.096974 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabd124e-9bb4-49f6-86e2-1ab79d3e314e" path="/var/lib/kubelet/pods/dabd124e-9bb4-49f6-86e2-1ab79d3e314e/volumes" Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.195744 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9bc55"] Sep 30 18:50:53 crc kubenswrapper[4747]: W0930 18:50:53.204050 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4f7959_bc18_4ea1_b257_4a1a1aa17395.slice/crio-e293d32231f215113cb4bde75ba1e51d5583f44c9c47a44295803f8ac71e276a WatchSource:0}: Error finding container 
e293d32231f215113cb4bde75ba1e51d5583f44c9c47a44295803f8ac71e276a: Status 404 returned error can't find the container with id e293d32231f215113cb4bde75ba1e51d5583f44c9c47a44295803f8ac71e276a Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.736234 4747 generic.go:334] "Generic (PLEG): container finished" podID="9b4f7959-bc18-4ea1-b257-4a1a1aa17395" containerID="bced562b1e380a4753cfa649583efcb9ed76fd138773ff4dd06d4fc9db78d2a8" exitCode=0 Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.736378 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bc55" event={"ID":"9b4f7959-bc18-4ea1-b257-4a1a1aa17395","Type":"ContainerDied","Data":"bced562b1e380a4753cfa649583efcb9ed76fd138773ff4dd06d4fc9db78d2a8"} Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.736850 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bc55" event={"ID":"9b4f7959-bc18-4ea1-b257-4a1a1aa17395","Type":"ContainerStarted","Data":"e293d32231f215113cb4bde75ba1e51d5583f44c9c47a44295803f8ac71e276a"} Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.744633 4747 generic.go:334] "Generic (PLEG): container finished" podID="f33b4813-db02-4f07-9662-58046af264f2" containerID="f923c74c87cedc693d2c452a5570ad6acf24b2c5229f8c9027b11c7605c2abf6" exitCode=0 Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.745147 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qj7z8" event={"ID":"f33b4813-db02-4f07-9662-58046af264f2","Type":"ContainerDied","Data":"f923c74c87cedc693d2c452a5570ad6acf24b2c5229f8c9027b11c7605c2abf6"} Sep 30 18:50:53 crc kubenswrapper[4747]: I0930 18:50:53.745260 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qj7z8" event={"ID":"f33b4813-db02-4f07-9662-58046af264f2","Type":"ContainerStarted","Data":"d6d72bcb1385e1a27b06863ae794cd5102cd3f17aaaf5024b575bbcbf18c6694"} Sep 30 18:50:54 
crc kubenswrapper[4747]: I0930 18:50:54.756742 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qj7z8" event={"ID":"f33b4813-db02-4f07-9662-58046af264f2","Type":"ContainerStarted","Data":"a12813c9d11d38a9247a426386ad74bb738f4aef560ac4e6b11fe603559991b0"} Sep 30 18:50:54 crc kubenswrapper[4747]: I0930 18:50:54.811637 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s9zdv"] Sep 30 18:50:54 crc kubenswrapper[4747]: I0930 18:50:54.813702 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:54 crc kubenswrapper[4747]: I0930 18:50:54.817154 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 18:50:54 crc kubenswrapper[4747]: I0930 18:50:54.824453 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s9zdv"] Sep 30 18:50:54 crc kubenswrapper[4747]: I0930 18:50:54.939959 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4395a0b-1f58-4e40-a352-db7ad2b5e688-catalog-content\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:54 crc kubenswrapper[4747]: I0930 18:50:54.940103 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xcp\" (UniqueName: \"kubernetes.io/projected/d4395a0b-1f58-4e40-a352-db7ad2b5e688-kube-api-access-s5xcp\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:54 crc kubenswrapper[4747]: I0930 18:50:54.940157 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4395a0b-1f58-4e40-a352-db7ad2b5e688-utilities\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.016241 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w49q7"] Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.017614 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.019590 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.031450 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w49q7"] Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.041120 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xcp\" (UniqueName: \"kubernetes.io/projected/d4395a0b-1f58-4e40-a352-db7ad2b5e688-kube-api-access-s5xcp\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.041470 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4395a0b-1f58-4e40-a352-db7ad2b5e688-utilities\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.041943 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d4395a0b-1f58-4e40-a352-db7ad2b5e688-utilities\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.042074 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4395a0b-1f58-4e40-a352-db7ad2b5e688-catalog-content\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.042334 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4395a0b-1f58-4e40-a352-db7ad2b5e688-catalog-content\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.073860 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xcp\" (UniqueName: \"kubernetes.io/projected/d4395a0b-1f58-4e40-a352-db7ad2b5e688-kube-api-access-s5xcp\") pod \"community-operators-s9zdv\" (UID: \"d4395a0b-1f58-4e40-a352-db7ad2b5e688\") " pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.143696 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-utilities\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.143970 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-catalog-content\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.144049 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4twnh\" (UniqueName: \"kubernetes.io/projected/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-kube-api-access-4twnh\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.205324 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.244991 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-catalog-content\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.245047 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4twnh\" (UniqueName: \"kubernetes.io/projected/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-kube-api-access-4twnh\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.245097 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-utilities\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " 
pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.245535 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-catalog-content\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.245647 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-utilities\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.269044 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4twnh\" (UniqueName: \"kubernetes.io/projected/e7c5dfa5-42d2-4344-bfb3-bd0781f392c5-kube-api-access-4twnh\") pod \"certified-operators-w49q7\" (UID: \"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5\") " pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.345456 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.427231 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s9zdv"] Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.557046 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w49q7"] Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.764725 4747 generic.go:334] "Generic (PLEG): container finished" podID="9b4f7959-bc18-4ea1-b257-4a1a1aa17395" containerID="7e0003c6cde2c9e94aabb0358240ac10f95a1075a2be758395432b64f7ddb2d5" exitCode=0 Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.764865 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bc55" event={"ID":"9b4f7959-bc18-4ea1-b257-4a1a1aa17395","Type":"ContainerDied","Data":"7e0003c6cde2c9e94aabb0358240ac10f95a1075a2be758395432b64f7ddb2d5"} Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.766239 4747 generic.go:334] "Generic (PLEG): container finished" podID="d4395a0b-1f58-4e40-a352-db7ad2b5e688" containerID="57ef751c4425b4528ee10fa1de6bd19afff4e544524e013e8fb2cea390540c3d" exitCode=0 Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.766287 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zdv" event={"ID":"d4395a0b-1f58-4e40-a352-db7ad2b5e688","Type":"ContainerDied","Data":"57ef751c4425b4528ee10fa1de6bd19afff4e544524e013e8fb2cea390540c3d"} Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.766308 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zdv" event={"ID":"d4395a0b-1f58-4e40-a352-db7ad2b5e688","Type":"ContainerStarted","Data":"fd81dea3327418a3be6d5c999d4e2d3303eea26ee4def43c34b85671c404cc77"} Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.776351 4747 generic.go:334] "Generic 
(PLEG): container finished" podID="f33b4813-db02-4f07-9662-58046af264f2" containerID="a12813c9d11d38a9247a426386ad74bb738f4aef560ac4e6b11fe603559991b0" exitCode=0 Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.776425 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qj7z8" event={"ID":"f33b4813-db02-4f07-9662-58046af264f2","Type":"ContainerDied","Data":"a12813c9d11d38a9247a426386ad74bb738f4aef560ac4e6b11fe603559991b0"} Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.779578 4747 generic.go:334] "Generic (PLEG): container finished" podID="e7c5dfa5-42d2-4344-bfb3-bd0781f392c5" containerID="5bf6426e5e46ada1d4e19350a420e675381c8b1c7097c88b11cca49618b5bbdf" exitCode=0 Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.779616 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w49q7" event={"ID":"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5","Type":"ContainerDied","Data":"5bf6426e5e46ada1d4e19350a420e675381c8b1c7097c88b11cca49618b5bbdf"} Sep 30 18:50:55 crc kubenswrapper[4747]: I0930 18:50:55.779642 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w49q7" event={"ID":"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5","Type":"ContainerStarted","Data":"e8b99bf2511b79721b331f2691988305671b9238b9a5c11d98e61a7af6102b6c"} Sep 30 18:50:56 crc kubenswrapper[4747]: I0930 18:50:56.797919 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qj7z8" event={"ID":"f33b4813-db02-4f07-9662-58046af264f2","Type":"ContainerStarted","Data":"7ac45e5cf0008ef35b054e6ec191efcbff2f9647b24b7d821a488d1c754a2618"} Sep 30 18:50:56 crc kubenswrapper[4747]: I0930 18:50:56.810899 4747 generic.go:334] "Generic (PLEG): container finished" podID="e7c5dfa5-42d2-4344-bfb3-bd0781f392c5" containerID="879b6d6db6bc49029fae2d2433ac98aac779f765a021a79502aaf5fea5d7844a" exitCode=0 Sep 30 18:50:56 crc 
kubenswrapper[4747]: I0930 18:50:56.811000 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w49q7" event={"ID":"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5","Type":"ContainerDied","Data":"879b6d6db6bc49029fae2d2433ac98aac779f765a021a79502aaf5fea5d7844a"} Sep 30 18:50:56 crc kubenswrapper[4747]: I0930 18:50:56.817154 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bc55" event={"ID":"9b4f7959-bc18-4ea1-b257-4a1a1aa17395","Type":"ContainerStarted","Data":"2f29f882f274a68dbf9c997a62547397d943e403fcaf0d49a2249eb3c7e97210"} Sep 30 18:50:56 crc kubenswrapper[4747]: I0930 18:50:56.821496 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qj7z8" podStartSLOduration=2.359650621 podStartE2EDuration="4.821482568s" podCreationTimestamp="2025-09-30 18:50:52 +0000 UTC" firstStartedPulling="2025-09-30 18:50:53.746893482 +0000 UTC m=+293.406373596" lastFinishedPulling="2025-09-30 18:50:56.208725429 +0000 UTC m=+295.868205543" observedRunningTime="2025-09-30 18:50:56.821309542 +0000 UTC m=+296.480789686" watchObservedRunningTime="2025-09-30 18:50:56.821482568 +0000 UTC m=+296.480962682" Sep 30 18:50:56 crc kubenswrapper[4747]: I0930 18:50:56.846664 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9bc55" podStartSLOduration=2.348205772 podStartE2EDuration="4.846643106s" podCreationTimestamp="2025-09-30 18:50:52 +0000 UTC" firstStartedPulling="2025-09-30 18:50:53.738074148 +0000 UTC m=+293.397554302" lastFinishedPulling="2025-09-30 18:50:56.236511522 +0000 UTC m=+295.895991636" observedRunningTime="2025-09-30 18:50:56.843261787 +0000 UTC m=+296.502741911" watchObservedRunningTime="2025-09-30 18:50:56.846643106 +0000 UTC m=+296.506123230" Sep 30 18:50:57 crc kubenswrapper[4747]: I0930 18:50:57.827015 4747 generic.go:334] "Generic (PLEG): container 
finished" podID="d4395a0b-1f58-4e40-a352-db7ad2b5e688" containerID="50f78e67fb26333c6a8dbc885f4832933ff3b666b39c06490667be04dd59f140" exitCode=0 Sep 30 18:50:57 crc kubenswrapper[4747]: I0930 18:50:57.827099 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zdv" event={"ID":"d4395a0b-1f58-4e40-a352-db7ad2b5e688","Type":"ContainerDied","Data":"50f78e67fb26333c6a8dbc885f4832933ff3b666b39c06490667be04dd59f140"} Sep 30 18:50:57 crc kubenswrapper[4747]: I0930 18:50:57.830091 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w49q7" event={"ID":"e7c5dfa5-42d2-4344-bfb3-bd0781f392c5","Type":"ContainerStarted","Data":"a18f2e757dd82230b3b94ae99745c9f4cbee5f2652998e6a660da73d1e69279e"} Sep 30 18:50:57 crc kubenswrapper[4747]: I0930 18:50:57.872250 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w49q7" podStartSLOduration=2.35945764 podStartE2EDuration="3.872229941s" podCreationTimestamp="2025-09-30 18:50:54 +0000 UTC" firstStartedPulling="2025-09-30 18:50:55.780684313 +0000 UTC m=+295.440164427" lastFinishedPulling="2025-09-30 18:50:57.293456614 +0000 UTC m=+296.952936728" observedRunningTime="2025-09-30 18:50:57.868586734 +0000 UTC m=+297.528066838" watchObservedRunningTime="2025-09-30 18:50:57.872229941 +0000 UTC m=+297.531710055" Sep 30 18:50:59 crc kubenswrapper[4747]: I0930 18:50:59.857636 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zdv" event={"ID":"d4395a0b-1f58-4e40-a352-db7ad2b5e688","Type":"ContainerStarted","Data":"abbc3d1b447b6c826db7552e575b647122cca0a3197c08c1a23ff85b0669564f"} Sep 30 18:50:59 crc kubenswrapper[4747]: I0930 18:50:59.884836 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s9zdv" podStartSLOduration=3.346793386 
podStartE2EDuration="5.884815972s" podCreationTimestamp="2025-09-30 18:50:54 +0000 UTC" firstStartedPulling="2025-09-30 18:50:55.773067589 +0000 UTC m=+295.432547703" lastFinishedPulling="2025-09-30 18:50:58.311090145 +0000 UTC m=+297.970570289" observedRunningTime="2025-09-30 18:50:59.880816614 +0000 UTC m=+299.540296738" watchObservedRunningTime="2025-09-30 18:50:59.884815972 +0000 UTC m=+299.544296096" Sep 30 18:51:02 crc kubenswrapper[4747]: I0930 18:51:02.751851 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:51:02 crc kubenswrapper[4747]: I0930 18:51:02.752826 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:51:02 crc kubenswrapper[4747]: I0930 18:51:02.816614 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:51:02 crc kubenswrapper[4747]: I0930 18:51:02.920179 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qj7z8" Sep 30 18:51:02 crc kubenswrapper[4747]: I0930 18:51:02.997230 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:51:02 crc kubenswrapper[4747]: I0930 18:51:02.997347 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:51:03 crc kubenswrapper[4747]: I0930 18:51:03.035965 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:51:03 crc kubenswrapper[4747]: I0930 18:51:03.956858 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9bc55" Sep 30 18:51:05 crc kubenswrapper[4747]: I0930 18:51:05.206141 4747 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:51:05 crc kubenswrapper[4747]: I0930 18:51:05.206202 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:51:05 crc kubenswrapper[4747]: I0930 18:51:05.272892 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:51:05 crc kubenswrapper[4747]: I0930 18:51:05.345849 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:51:05 crc kubenswrapper[4747]: I0930 18:51:05.346121 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:51:05 crc kubenswrapper[4747]: I0930 18:51:05.397546 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:51:05 crc kubenswrapper[4747]: I0930 18:51:05.951883 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s9zdv" Sep 30 18:51:05 crc kubenswrapper[4747]: I0930 18:51:05.967425 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w49q7" Sep 30 18:51:37 crc kubenswrapper[4747]: I0930 18:51:37.656120 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:51:37 crc kubenswrapper[4747]: I0930 18:51:37.656847 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" 
podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:52:07 crc kubenswrapper[4747]: I0930 18:52:07.655566 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:52:07 crc kubenswrapper[4747]: I0930 18:52:07.658182 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:52:37 crc kubenswrapper[4747]: I0930 18:52:37.656402 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:52:37 crc kubenswrapper[4747]: I0930 18:52:37.657203 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:52:37 crc kubenswrapper[4747]: I0930 18:52:37.657294 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:52:37 crc kubenswrapper[4747]: I0930 18:52:37.658519 4747 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58baf4587e29954a036ee1a983f6ca3f3c55174a0a0fbc62706e1e634f34fbef"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:52:37 crc kubenswrapper[4747]: I0930 18:52:37.658631 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://58baf4587e29954a036ee1a983f6ca3f3c55174a0a0fbc62706e1e634f34fbef" gracePeriod=600 Sep 30 18:52:38 crc kubenswrapper[4747]: I0930 18:52:38.566832 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="58baf4587e29954a036ee1a983f6ca3f3c55174a0a0fbc62706e1e634f34fbef" exitCode=0 Sep 30 18:52:38 crc kubenswrapper[4747]: I0930 18:52:38.566986 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"58baf4587e29954a036ee1a983f6ca3f3c55174a0a0fbc62706e1e634f34fbef"} Sep 30 18:52:38 crc kubenswrapper[4747]: I0930 18:52:38.567883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"d543f08c59a323444d6e9001d7802b512e4a86e59b1b1b4efef8d96bd15c0e26"} Sep 30 18:52:38 crc kubenswrapper[4747]: I0930 18:52:38.567915 4747 scope.go:117] "RemoveContainer" containerID="e359df8de3d76f059593bbcfcb16181e8dc4f5a3dac39d48dcf30cab3d54e1e9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.787797 4747 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-pxxf9"] Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.789053 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.812674 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pxxf9"] Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.952401 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tthq\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-kube-api-access-8tthq\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.952448 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5df9d969-0c65-461b-ad11-9bb2136fba05-registry-certificates\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.952485 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-registry-tls\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.952597 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5df9d969-0c65-461b-ad11-9bb2136fba05-trusted-ca\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.952627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.952652 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5df9d969-0c65-461b-ad11-9bb2136fba05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.952669 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-bound-sa-token\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.952688 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5df9d969-0c65-461b-ad11-9bb2136fba05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 
30 18:52:54 crc kubenswrapper[4747]: I0930 18:52:54.979514 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.054609 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5df9d969-0c65-461b-ad11-9bb2136fba05-trusted-ca\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.055128 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5df9d969-0c65-461b-ad11-9bb2136fba05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.055177 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-bound-sa-token\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.055253 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5df9d969-0c65-461b-ad11-9bb2136fba05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: 
\"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.055340 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tthq\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-kube-api-access-8tthq\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.055395 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5df9d969-0c65-461b-ad11-9bb2136fba05-registry-certificates\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.055450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-registry-tls\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.056195 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5df9d969-0c65-461b-ad11-9bb2136fba05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.056870 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/5df9d969-0c65-461b-ad11-9bb2136fba05-registry-certificates\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.057545 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5df9d969-0c65-461b-ad11-9bb2136fba05-trusted-ca\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.062148 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-registry-tls\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.068348 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5df9d969-0c65-461b-ad11-9bb2136fba05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.072146 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-bound-sa-token\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.082315 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8tthq\" (UniqueName: \"kubernetes.io/projected/5df9d969-0c65-461b-ad11-9bb2136fba05-kube-api-access-8tthq\") pod \"image-registry-66df7c8f76-pxxf9\" (UID: \"5df9d969-0c65-461b-ad11-9bb2136fba05\") " pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.103395 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.322018 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pxxf9"] Sep 30 18:52:55 crc kubenswrapper[4747]: W0930 18:52:55.328774 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df9d969_0c65_461b_ad11_9bb2136fba05.slice/crio-30b8844df57ddfb07b2d6364039c4a0965740cd3fd877d12a6dd289f9a3c3858 WatchSource:0}: Error finding container 30b8844df57ddfb07b2d6364039c4a0965740cd3fd877d12a6dd289f9a3c3858: Status 404 returned error can't find the container with id 30b8844df57ddfb07b2d6364039c4a0965740cd3fd877d12a6dd289f9a3c3858 Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.699048 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" event={"ID":"5df9d969-0c65-461b-ad11-9bb2136fba05","Type":"ContainerStarted","Data":"08336eb1f7013127a18530d0234af0fc8409f9737d572c8310c9f5bf961c6068"} Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.699112 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" event={"ID":"5df9d969-0c65-461b-ad11-9bb2136fba05","Type":"ContainerStarted","Data":"30b8844df57ddfb07b2d6364039c4a0965740cd3fd877d12a6dd289f9a3c3858"} Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.699240 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:52:55 crc kubenswrapper[4747]: I0930 18:52:55.731290 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" podStartSLOduration=1.7312710610000002 podStartE2EDuration="1.731271061s" podCreationTimestamp="2025-09-30 18:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:52:55.725735628 +0000 UTC m=+415.385215772" watchObservedRunningTime="2025-09-30 18:52:55.731271061 +0000 UTC m=+415.390751215" Sep 30 18:53:15 crc kubenswrapper[4747]: I0930 18:53:15.115336 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-pxxf9" Sep 30 18:53:15 crc kubenswrapper[4747]: I0930 18:53:15.197698 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h88d"] Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.287100 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" podUID="07103b35-ea08-4d06-b981-d04736a21d17" containerName="registry" containerID="cri-o://8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18" gracePeriod=30 Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.677811 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.845340 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07103b35-ea08-4d06-b981-d04736a21d17-ca-trust-extracted\") pod \"07103b35-ea08-4d06-b981-d04736a21d17\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.845700 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-bound-sa-token\") pod \"07103b35-ea08-4d06-b981-d04736a21d17\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.845787 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-registry-certificates\") pod \"07103b35-ea08-4d06-b981-d04736a21d17\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.845857 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-trusted-ca\") pod \"07103b35-ea08-4d06-b981-d04736a21d17\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.845887 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-registry-tls\") pod \"07103b35-ea08-4d06-b981-d04736a21d17\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.846148 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"07103b35-ea08-4d06-b981-d04736a21d17\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.846209 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07103b35-ea08-4d06-b981-d04736a21d17-installation-pull-secrets\") pod \"07103b35-ea08-4d06-b981-d04736a21d17\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.846315 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v8f2\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-kube-api-access-6v8f2\") pod \"07103b35-ea08-4d06-b981-d04736a21d17\" (UID: \"07103b35-ea08-4d06-b981-d04736a21d17\") " Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.848441 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "07103b35-ea08-4d06-b981-d04736a21d17" (UID: "07103b35-ea08-4d06-b981-d04736a21d17"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.848765 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "07103b35-ea08-4d06-b981-d04736a21d17" (UID: "07103b35-ea08-4d06-b981-d04736a21d17"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.856315 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "07103b35-ea08-4d06-b981-d04736a21d17" (UID: "07103b35-ea08-4d06-b981-d04736a21d17"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.856722 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07103b35-ea08-4d06-b981-d04736a21d17-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "07103b35-ea08-4d06-b981-d04736a21d17" (UID: "07103b35-ea08-4d06-b981-d04736a21d17"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.857009 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "07103b35-ea08-4d06-b981-d04736a21d17" (UID: "07103b35-ea08-4d06-b981-d04736a21d17"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.859333 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-kube-api-access-6v8f2" (OuterVolumeSpecName: "kube-api-access-6v8f2") pod "07103b35-ea08-4d06-b981-d04736a21d17" (UID: "07103b35-ea08-4d06-b981-d04736a21d17"). InnerVolumeSpecName "kube-api-access-6v8f2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.865714 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "07103b35-ea08-4d06-b981-d04736a21d17" (UID: "07103b35-ea08-4d06-b981-d04736a21d17"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.883445 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07103b35-ea08-4d06-b981-d04736a21d17-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "07103b35-ea08-4d06-b981-d04736a21d17" (UID: "07103b35-ea08-4d06-b981-d04736a21d17"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.947047 4747 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.947078 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07103b35-ea08-4d06-b981-d04736a21d17-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.947088 4747 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.947098 4747 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/07103b35-ea08-4d06-b981-d04736a21d17-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.947107 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v8f2\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-kube-api-access-6v8f2\") on node \"crc\" DevicePath \"\"" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.947114 4747 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07103b35-ea08-4d06-b981-d04736a21d17-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 18:53:40 crc kubenswrapper[4747]: I0930 18:53:40.947124 4747 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07103b35-ea08-4d06-b981-d04736a21d17-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.008404 4747 generic.go:334] "Generic (PLEG): container finished" podID="07103b35-ea08-4d06-b981-d04736a21d17" containerID="8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18" exitCode=0 Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.008482 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" event={"ID":"07103b35-ea08-4d06-b981-d04736a21d17","Type":"ContainerDied","Data":"8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18"} Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.008518 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" event={"ID":"07103b35-ea08-4d06-b981-d04736a21d17","Type":"ContainerDied","Data":"464f3ea0b76001a5a5ad3bc489a05c05f611dcd2c4699c9aff4fec9e7f0fedae"} Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.008541 4747 scope.go:117] "RemoveContainer" 
containerID="8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18" Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.008549 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.028244 4747 scope.go:117] "RemoveContainer" containerID="8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18" Sep 30 18:53:41 crc kubenswrapper[4747]: E0930 18:53:41.028785 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18\": container with ID starting with 8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18 not found: ID does not exist" containerID="8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18" Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.028828 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18"} err="failed to get container status \"8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18\": rpc error: code = NotFound desc = could not find container \"8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18\": container with ID starting with 8166a65d940f64d431bc834fdd3126b1129f4108bbf9040baae7a06118fbae18 not found: ID does not exist" Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.066872 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h88d"] Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.075000 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h88d"] Sep 30 18:53:41 crc kubenswrapper[4747]: I0930 18:53:41.098707 4747 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="07103b35-ea08-4d06-b981-d04736a21d17" path="/var/lib/kubelet/pods/07103b35-ea08-4d06-b981-d04736a21d17/volumes" Sep 30 18:53:45 crc kubenswrapper[4747]: I0930 18:53:45.610192 4747 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-6h88d container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.16:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 18:53:45 crc kubenswrapper[4747]: I0930 18:53:45.610765 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-6h88d" podUID="07103b35-ea08-4d06-b981-d04736a21d17" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.16:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 18:54:37 crc kubenswrapper[4747]: I0930 18:54:37.656247 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:54:37 crc kubenswrapper[4747]: I0930 18:54:37.657147 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:55:07 crc kubenswrapper[4747]: I0930 18:55:07.655434 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:55:07 crc kubenswrapper[4747]: I0930 18:55:07.656386 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:55:37 crc kubenswrapper[4747]: I0930 18:55:37.655671 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:55:37 crc kubenswrapper[4747]: I0930 18:55:37.656516 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:55:37 crc kubenswrapper[4747]: I0930 18:55:37.656591 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:55:37 crc kubenswrapper[4747]: I0930 18:55:37.657488 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d543f08c59a323444d6e9001d7802b512e4a86e59b1b1b4efef8d96bd15c0e26"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:55:37 crc kubenswrapper[4747]: I0930 18:55:37.657587 4747 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://d543f08c59a323444d6e9001d7802b512e4a86e59b1b1b4efef8d96bd15c0e26" gracePeriod=600 Sep 30 18:55:37 crc kubenswrapper[4747]: I0930 18:55:37.804660 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="d543f08c59a323444d6e9001d7802b512e4a86e59b1b1b4efef8d96bd15c0e26" exitCode=0 Sep 30 18:55:37 crc kubenswrapper[4747]: I0930 18:55:37.805030 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"d543f08c59a323444d6e9001d7802b512e4a86e59b1b1b4efef8d96bd15c0e26"} Sep 30 18:55:37 crc kubenswrapper[4747]: I0930 18:55:37.805074 4747 scope.go:117] "RemoveContainer" containerID="58baf4587e29954a036ee1a983f6ca3f3c55174a0a0fbc62706e1e634f34fbef" Sep 30 18:55:38 crc kubenswrapper[4747]: I0930 18:55:38.816080 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"f3b9f45b84cc1eae815bcdc0ad8efb2eb78da9ac4324427d149fbbf26250b353"} Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.215668 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pnqjs"] Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.217438 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovn-controller" containerID="cri-o://2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914" gracePeriod=30 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.217898 4747 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="nbdb" containerID="cri-o://6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" gracePeriod=30 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.218221 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kube-rbac-proxy-node" containerID="cri-o://641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8" gracePeriod=30 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.218421 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovn-acl-logging" containerID="cri-o://e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23" gracePeriod=30 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.217425 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="sbdb" containerID="cri-o://0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" gracePeriod=30 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.218500 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560" gracePeriod=30 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.218476 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" 
podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="northd" containerID="cri-o://929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493" gracePeriod=30 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.283267 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" containerID="cri-o://820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1" gracePeriod=30 Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.515329 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91 is running failed: container process not found" containerID="0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.515429 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d is running failed: container process not found" containerID="6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.515649 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91 is running failed: container process not found" containerID="0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.515844 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d is running failed: container process not found" containerID="6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.515937 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91 is running failed: container process not found" containerID="0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.515970 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="sbdb" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.516319 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d is running failed: container process not found" containerID="6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.516378 4747 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="nbdb" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.529188 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/2.log" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.530072 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/1.log" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.530145 4747 generic.go:334] 
"Generic (PLEG): container finished" podID="34f8698b-7682-4b27-99d0-d72fff30d5a8" containerID="aa3692f348ec4e24682c7900affd59c3ddc6c6a3de5a1e5a2f45a754c971356d" exitCode=2 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.530241 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zjq4" event={"ID":"34f8698b-7682-4b27-99d0-d72fff30d5a8","Type":"ContainerDied","Data":"aa3692f348ec4e24682c7900affd59c3ddc6c6a3de5a1e5a2f45a754c971356d"} Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.530318 4747 scope.go:117] "RemoveContainer" containerID="f5e461ab27da42ffaec705181407093f7fc3daa646ea03f81051b0b512149a33" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.530967 4747 scope.go:117] "RemoveContainer" containerID="aa3692f348ec4e24682c7900affd59c3ddc6c6a3de5a1e5a2f45a754c971356d" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.531260 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4zjq4_openshift-multus(34f8698b-7682-4b27-99d0-d72fff30d5a8)\"" pod="openshift-multus/multus-4zjq4" podUID="34f8698b-7682-4b27-99d0-d72fff30d5a8" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.535704 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/3.log" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.545859 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovn-acl-logging/0.log" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.546593 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovn-controller/0.log" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547127 
4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1" exitCode=0 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547161 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" exitCode=0 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547175 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560" exitCode=0 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547188 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8" exitCode=0 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547201 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23" exitCode=143 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547217 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914" exitCode=143 Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547234 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1"} Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547295 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" 
event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d"} Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547320 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560"} Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547340 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8"} Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547364 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23"} Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.547383 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914"} Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.596802 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovnkube-controller/3.log" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.599006 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovn-acl-logging/0.log" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.599380 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovn-controller/0.log" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.599794 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.601074 4747 scope.go:117] "RemoveContainer" containerID="4ba5e3c666eea6037ec48ad926a9b7d0171c0bd9e7e163018ce78c70276d42d5" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.665048 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2j2mw"] Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.665644 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="nbdb" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.665802 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="nbdb" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.665998 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.666124 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.666247 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.666361 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.666484 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.666617 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.666739 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovn-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.666848 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovn-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.666989 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kubecfg-setup" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.667095 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kubecfg-setup" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.667224 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="sbdb" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.667341 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="sbdb" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.667467 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="northd" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.667600 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="northd" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.667723 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" 
containerName="ovn-acl-logging" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.667831 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovn-acl-logging" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.667994 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kube-rbac-proxy-node" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.668121 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kube-rbac-proxy-node" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.668239 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.668349 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.668470 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07103b35-ea08-4d06-b981-d04736a21d17" containerName="registry" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.668604 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="07103b35-ea08-4d06-b981-d04736a21d17" containerName="registry" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.668903 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="northd" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.669077 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.669201 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" 
containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.669312 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.669423 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="07103b35-ea08-4d06-b981-d04736a21d17" containerName="registry" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.669566 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.669681 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="nbdb" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.669793 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="sbdb" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.669958 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovn-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.670079 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="kube-rbac-proxy-node" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.670207 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovn-acl-logging" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.670323 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.670639 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.670745 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: E0930 18:57:23.670814 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.670883 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.670936 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.670842 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-openvswitch\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671169 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-script-lib\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671200 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bsls\" (UniqueName: \"kubernetes.io/projected/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-kube-api-access-9bsls\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671221 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-var-lib-openvswitch\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671241 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-slash\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671272 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-etc-openvswitch\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671300 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-systemd\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671323 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-systemd-units\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671342 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-bin\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671367 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-kubelet\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671386 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-ovn\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 
18:57:23.671410 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-node-log\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671432 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671428 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671452 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671482 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671515 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671487 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671539 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671508 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-node-log" (OuterVolumeSpecName: "node-log") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671453 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-netd\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671562 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671628 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-slash" (OuterVolumeSpecName: "host-slash") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671643 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671655 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-env-overrides\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671830 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-netns\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671879 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovn-node-metrics-cert\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671962 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.671991 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-ovn-kubernetes\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672137 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672181 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-config\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672278 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-log-socket\") pod \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\" (UID: \"5851f3a5-36f6-4e85-8584-5ce70fda9d7d\") " Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672291 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672363 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-log-socket" (OuterVolumeSpecName: "log-socket") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672419 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672586 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerName="ovnkube-controller" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672849 4747 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672889 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672909 4747 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: 
I0930 18:57:23.672841 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672952 4747 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.672999 4747 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673013 4747 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673027 4747 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673040 4747 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673052 4747 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc 
kubenswrapper[4747]: I0930 18:57:23.673064 4747 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673075 4747 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673089 4747 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673103 4747 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673115 4747 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673127 4747 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.673138 4747 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.675325 4747 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.678489 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.681097 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-kube-api-access-9bsls" (OuterVolumeSpecName: "kube-api-access-9bsls") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "kube-api-access-9bsls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.691946 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5851f3a5-36f6-4e85-8584-5ce70fda9d7d" (UID: "5851f3a5-36f6-4e85-8584-5ce70fda9d7d"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774258 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovnkube-config\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774330 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-run-ovn-kubernetes\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774371 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-ovn\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774413 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774453 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774487 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-cni-bin\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774516 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-node-log\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774547 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovn-node-metrics-cert\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774580 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-run-netns\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774622 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cxcns\" (UniqueName: \"kubernetes.io/projected/fee965e6-4c96-4ec4-aa99-48f06c97afb2-kube-api-access-cxcns\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774654 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-systemd-units\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774695 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-log-socket\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774735 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-env-overrides\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774764 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-systemd\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774795 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovnkube-script-lib\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774832 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-kubelet\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774864 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-cni-netd\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774894 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-slash\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774954 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-var-lib-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.774985 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-etc-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.775048 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bsls\" (UniqueName: \"kubernetes.io/projected/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-kube-api-access-9bsls\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.775070 4747 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.775091 4747 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.775109 4747 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5851f3a5-36f6-4e85-8584-5ce70fda9d7d-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876290 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-log-socket\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876383 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-systemd\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876421 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-env-overrides\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876459 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovnkube-script-lib\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876521 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-kubelet\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876597 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-systemd\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876609 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-cni-netd\") pod 
\"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876718 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-kubelet\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876774 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-slash\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876737 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-slash\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876853 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-var-lib-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876891 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-etc-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876965 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovnkube-config\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877009 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-run-ovn-kubernetes\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877061 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-ovn\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877118 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877154 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877206 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-cni-bin\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877238 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-node-log\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877272 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovn-node-metrics-cert\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877311 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-run-netns\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877374 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxcns\" (UniqueName: \"kubernetes.io/projected/fee965e6-4c96-4ec4-aa99-48f06c97afb2-kube-api-access-cxcns\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc 
kubenswrapper[4747]: I0930 18:57:23.877408 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-systemd-units\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877501 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-env-overrides\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877536 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-systemd-units\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876519 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-log-socket\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877605 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877606 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877646 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-var-lib-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.876674 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-cni-netd\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877705 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-node-log\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877745 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-etc-openvswitch\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.878034 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovnkube-script-lib\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.878127 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-run-ovn\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.878187 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-run-ovn-kubernetes\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.878239 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-run-netns\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.877669 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fee965e6-4c96-4ec4-aa99-48f06c97afb2-host-cni-bin\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.878635 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovnkube-config\") pod \"ovnkube-node-2j2mw\" (UID: 
\"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.883637 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fee965e6-4c96-4ec4-aa99-48f06c97afb2-ovn-node-metrics-cert\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.907541 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxcns\" (UniqueName: \"kubernetes.io/projected/fee965e6-4c96-4ec4-aa99-48f06c97afb2-kube-api-access-cxcns\") pod \"ovnkube-node-2j2mw\" (UID: \"fee965e6-4c96-4ec4-aa99-48f06c97afb2\") " pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:23 crc kubenswrapper[4747]: I0930 18:57:23.993362 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.555743 4747 generic.go:334] "Generic (PLEG): container finished" podID="fee965e6-4c96-4ec4-aa99-48f06c97afb2" containerID="6e71be11d4adfc40fcf5eb5c945fd07b6330a1985b61b1b7f4476a17fad640dc" exitCode=0 Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.555872 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerDied","Data":"6e71be11d4adfc40fcf5eb5c945fd07b6330a1985b61b1b7f4476a17fad640dc"} Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.556368 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"b24f9fb7e966fc62919ac73845213dfa22695bd6c07d793987f53fbfe43219c3"} Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 
18:57:24.567623 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovn-acl-logging/0.log" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.568420 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pnqjs_5851f3a5-36f6-4e85-8584-5ce70fda9d7d/ovn-controller/0.log" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.568987 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" exitCode=0 Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.569017 4747 generic.go:334] "Generic (PLEG): container finished" podID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" containerID="929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493" exitCode=0 Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.569098 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91"} Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.569136 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493"} Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.569160 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" event={"ID":"5851f3a5-36f6-4e85-8584-5ce70fda9d7d","Type":"ContainerDied","Data":"1b1a9c722e051ed0fa5548647209c10e8877d9b139e5458e769b6a9e75e96baf"} Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.569188 4747 scope.go:117] "RemoveContainer" 
containerID="820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.569367 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pnqjs" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.577096 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/2.log" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.623017 4747 scope.go:117] "RemoveContainer" containerID="0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.673850 4747 scope.go:117] "RemoveContainer" containerID="6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.694807 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pnqjs"] Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.705164 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pnqjs"] Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.731171 4747 scope.go:117] "RemoveContainer" containerID="929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.755334 4747 scope.go:117] "RemoveContainer" containerID="5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.773706 4747 scope.go:117] "RemoveContainer" containerID="641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.791315 4747 scope.go:117] "RemoveContainer" containerID="e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.805570 4747 scope.go:117] "RemoveContainer" 
containerID="2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.830888 4747 scope.go:117] "RemoveContainer" containerID="1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.856605 4747 scope.go:117] "RemoveContainer" containerID="820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.856943 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1\": container with ID starting with 820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1 not found: ID does not exist" containerID="820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.856984 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1"} err="failed to get container status \"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1\": rpc error: code = NotFound desc = could not find container \"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1\": container with ID starting with 820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.857010 4747 scope.go:117] "RemoveContainer" containerID="0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.857654 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\": container with ID starting with 
0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91 not found: ID does not exist" containerID="0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.857729 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91"} err="failed to get container status \"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\": rpc error: code = NotFound desc = could not find container \"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\": container with ID starting with 0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.857786 4747 scope.go:117] "RemoveContainer" containerID="6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.858232 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\": container with ID starting with 6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d not found: ID does not exist" containerID="6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.858263 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d"} err="failed to get container status \"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\": rpc error: code = NotFound desc = could not find container \"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\": container with ID starting with 6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d not found: ID does not 
exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.858281 4747 scope.go:117] "RemoveContainer" containerID="929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.859046 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\": container with ID starting with 929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493 not found: ID does not exist" containerID="929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.859084 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493"} err="failed to get container status \"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\": rpc error: code = NotFound desc = could not find container \"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\": container with ID starting with 929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.859106 4747 scope.go:117] "RemoveContainer" containerID="5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.859504 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\": container with ID starting with 5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560 not found: ID does not exist" containerID="5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.859558 4747 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560"} err="failed to get container status \"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\": rpc error: code = NotFound desc = could not find container \"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\": container with ID starting with 5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.859598 4747 scope.go:117] "RemoveContainer" containerID="641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.860041 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\": container with ID starting with 641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8 not found: ID does not exist" containerID="641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.860076 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8"} err="failed to get container status \"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\": rpc error: code = NotFound desc = could not find container \"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\": container with ID starting with 641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.860104 4747 scope.go:117] "RemoveContainer" containerID="e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.860464 4747 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\": container with ID starting with e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23 not found: ID does not exist" containerID="e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.860506 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23"} err="failed to get container status \"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\": rpc error: code = NotFound desc = could not find container \"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\": container with ID starting with e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.860587 4747 scope.go:117] "RemoveContainer" containerID="2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.861333 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\": container with ID starting with 2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914 not found: ID does not exist" containerID="2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.861383 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914"} err="failed to get container status \"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\": rpc error: code = NotFound desc = could 
not find container \"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\": container with ID starting with 2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.861402 4747 scope.go:117] "RemoveContainer" containerID="1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3" Sep 30 18:57:24 crc kubenswrapper[4747]: E0930 18:57:24.861999 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\": container with ID starting with 1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3 not found: ID does not exist" containerID="1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.862026 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3"} err="failed to get container status \"1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\": rpc error: code = NotFound desc = could not find container \"1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\": container with ID starting with 1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.862043 4747 scope.go:117] "RemoveContainer" containerID="820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.862442 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1"} err="failed to get container status \"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1\": rpc error: code = NotFound 
desc = could not find container \"820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1\": container with ID starting with 820514938bc61fe615132bcc8e91cb085435630842b354dd14fb7799c6942ed1 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.862501 4747 scope.go:117] "RemoveContainer" containerID="0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.862804 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91"} err="failed to get container status \"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\": rpc error: code = NotFound desc = could not find container \"0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91\": container with ID starting with 0062dcf6ad472cb809d3f2976924fee249bc0bde3d23db1449ebc39c3cf1df91 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.862840 4747 scope.go:117] "RemoveContainer" containerID="6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.863384 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d"} err="failed to get container status \"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\": rpc error: code = NotFound desc = could not find container \"6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d\": container with ID starting with 6189514ca2d8d0f033cd396750342ec10d4f62837396cb7429a80c88a6138b6d not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.863431 4747 scope.go:117] "RemoveContainer" containerID="929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 
18:57:24.863832 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493"} err="failed to get container status \"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\": rpc error: code = NotFound desc = could not find container \"929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493\": container with ID starting with 929e4ec4b61f6602ad8bf51cc0c1358378b12b9b8276ebbba44e916407b87493 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.863863 4747 scope.go:117] "RemoveContainer" containerID="5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.864201 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560"} err="failed to get container status \"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\": rpc error: code = NotFound desc = could not find container \"5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560\": container with ID starting with 5f21e9085c7d4b27074ae8fc089884fd67c9bc06b5fcefde35d61c57a6ec1560 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.864249 4747 scope.go:117] "RemoveContainer" containerID="641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.864529 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8"} err="failed to get container status \"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\": rpc error: code = NotFound desc = could not find container \"641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8\": container with ID starting with 
641beb8775d9c34df9c63a7e5110fc2152db575198ac4ac194529b04333486b8 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.864562 4747 scope.go:117] "RemoveContainer" containerID="e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.865165 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23"} err="failed to get container status \"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\": rpc error: code = NotFound desc = could not find container \"e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23\": container with ID starting with e6ca8cd9f90c0afeaf22376e421fc75fa9a2433f431bb9b02bf0f8451d8fec23 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.865192 4747 scope.go:117] "RemoveContainer" containerID="2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.865521 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914"} err="failed to get container status \"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\": rpc error: code = NotFound desc = could not find container \"2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914\": container with ID starting with 2b48ff79adddae97ec212af67bc6a2f15fc3b0a064576b77bd3b8c2f2db8b914 not found: ID does not exist" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.865562 4747 scope.go:117] "RemoveContainer" containerID="1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3" Sep 30 18:57:24 crc kubenswrapper[4747]: I0930 18:57:24.866080 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3"} err="failed to get container status \"1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\": rpc error: code = NotFound desc = could not find container \"1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3\": container with ID starting with 1c6676755da34c2bf1800b37e631d3235d1329702d058a490083bcb49d376fd3 not found: ID does not exist" Sep 30 18:57:25 crc kubenswrapper[4747]: I0930 18:57:25.100691 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5851f3a5-36f6-4e85-8584-5ce70fda9d7d" path="/var/lib/kubelet/pods/5851f3a5-36f6-4e85-8584-5ce70fda9d7d/volumes" Sep 30 18:57:25 crc kubenswrapper[4747]: I0930 18:57:25.588361 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"f5f84497c9a0a078ad88e739e01196e5dfc1ebb3ccd930f6f5356c5ca6908f5f"} Sep 30 18:57:25 crc kubenswrapper[4747]: I0930 18:57:25.588705 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"3592c58fcabf811b863eb604d4ce597b5cd86de26de8e1d8e8693d4ae54f7504"} Sep 30 18:57:25 crc kubenswrapper[4747]: I0930 18:57:25.588720 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"e75d27157e81d5e30cb8d855c42d704b884cf9f5073dd4d62ee847c02c37244d"} Sep 30 18:57:25 crc kubenswrapper[4747]: I0930 18:57:25.588733 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"2b66ec750da7f875fa6ab085db4e96e9442fc7d48c7cb709e7d4e750a079a45e"} Sep 30 
18:57:25 crc kubenswrapper[4747]: I0930 18:57:25.588745 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"c9372d44c7bfe0c4906256d371633b53ab4ff7af9687b0818a8f44fe1fc4c473"} Sep 30 18:57:25 crc kubenswrapper[4747]: I0930 18:57:25.588759 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"407ed2b67fc7403d1ea61d6483ddc88fb7f18a0b92eea26d3a510756548e3c47"} Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.179443 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ws48w"] Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.181284 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.184409 4747 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-8qcdh" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.186133 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.187895 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.190198 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.236835 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-crc-storage\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " 
pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.236907 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-node-mnt\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.237002 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfvw\" (UniqueName: \"kubernetes.io/projected/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-kube-api-access-whfvw\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.338261 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-crc-storage\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.338340 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-node-mnt\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.338418 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whfvw\" (UniqueName: \"kubernetes.io/projected/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-kube-api-access-whfvw\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc 
kubenswrapper[4747]: I0930 18:57:27.340340 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-crc-storage\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.340649 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-node-mnt\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.374289 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whfvw\" (UniqueName: \"kubernetes.io/projected/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-kube-api-access-whfvw\") pod \"crc-storage-crc-ws48w\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: I0930 18:57:27.508012 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: E0930 18:57:27.551703 4747 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(23b7b7556c8b5145594477cfb68e54108338838134a2a3752e67aa524d270fd3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 30 18:57:27 crc kubenswrapper[4747]: E0930 18:57:27.551808 4747 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(23b7b7556c8b5145594477cfb68e54108338838134a2a3752e67aa524d270fd3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: E0930 18:57:27.551854 4747 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(23b7b7556c8b5145594477cfb68e54108338838134a2a3752e67aa524d270fd3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:27 crc kubenswrapper[4747]: E0930 18:57:27.551965 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ws48w_crc-storage(6fc34c99-fd5a-4646-9b60-2dee2279a6d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ws48w_crc-storage(6fc34c99-fd5a-4646-9b60-2dee2279a6d4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(23b7b7556c8b5145594477cfb68e54108338838134a2a3752e67aa524d270fd3): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ws48w" podUID="6fc34c99-fd5a-4646-9b60-2dee2279a6d4" Sep 30 18:57:28 crc kubenswrapper[4747]: I0930 18:57:28.612461 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"2e9c523560a358b6be16047940494b0b3f7ed4289d019423b68efb2aa800eb92"} Sep 30 18:57:30 crc kubenswrapper[4747]: I0930 18:57:30.628296 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" event={"ID":"fee965e6-4c96-4ec4-aa99-48f06c97afb2","Type":"ContainerStarted","Data":"9e931dc57b978f1d5bf91e87276160532583642254d44425e53f6d2903568d89"} Sep 30 18:57:30 crc kubenswrapper[4747]: I0930 18:57:30.628761 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:30 crc kubenswrapper[4747]: I0930 18:57:30.628787 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:30 crc kubenswrapper[4747]: I0930 18:57:30.674948 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:30 crc kubenswrapper[4747]: I0930 18:57:30.683912 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" podStartSLOduration=7.683696698 podStartE2EDuration="7.683696698s" podCreationTimestamp="2025-09-30 18:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:57:30.681471574 +0000 UTC m=+690.340951738" watchObservedRunningTime="2025-09-30 18:57:30.683696698 +0000 UTC m=+690.343176812" Sep 30 18:57:31 crc kubenswrapper[4747]: I0930 18:57:31.002980 4747 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["crc-storage/crc-storage-crc-ws48w"] Sep 30 18:57:31 crc kubenswrapper[4747]: I0930 18:57:31.003592 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:31 crc kubenswrapper[4747]: I0930 18:57:31.004422 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:31 crc kubenswrapper[4747]: E0930 18:57:31.028539 4747 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(af87a9432a68e9f9b1faa215bcc77c09fdcc679023a1e53f15af987e48227676): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 18:57:31 crc kubenswrapper[4747]: E0930 18:57:31.028733 4747 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(af87a9432a68e9f9b1faa215bcc77c09fdcc679023a1e53f15af987e48227676): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:31 crc kubenswrapper[4747]: E0930 18:57:31.028876 4747 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(af87a9432a68e9f9b1faa215bcc77c09fdcc679023a1e53f15af987e48227676): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:31 crc kubenswrapper[4747]: E0930 18:57:31.029107 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ws48w_crc-storage(6fc34c99-fd5a-4646-9b60-2dee2279a6d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ws48w_crc-storage(6fc34c99-fd5a-4646-9b60-2dee2279a6d4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(af87a9432a68e9f9b1faa215bcc77c09fdcc679023a1e53f15af987e48227676): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ws48w" podUID="6fc34c99-fd5a-4646-9b60-2dee2279a6d4" Sep 30 18:57:31 crc kubenswrapper[4747]: I0930 18:57:31.634689 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:31 crc kubenswrapper[4747]: I0930 18:57:31.674217 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:37 crc kubenswrapper[4747]: I0930 18:57:37.655827 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:57:37 crc kubenswrapper[4747]: I0930 18:57:37.656773 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:57:38 crc kubenswrapper[4747]: I0930 18:57:38.088111 4747 
scope.go:117] "RemoveContainer" containerID="aa3692f348ec4e24682c7900affd59c3ddc6c6a3de5a1e5a2f45a754c971356d" Sep 30 18:57:38 crc kubenswrapper[4747]: E0930 18:57:38.088443 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4zjq4_openshift-multus(34f8698b-7682-4b27-99d0-d72fff30d5a8)\"" pod="openshift-multus/multus-4zjq4" podUID="34f8698b-7682-4b27-99d0-d72fff30d5a8" Sep 30 18:57:41 crc kubenswrapper[4747]: I0930 18:57:41.087143 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:41 crc kubenswrapper[4747]: I0930 18:57:41.090095 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:41 crc kubenswrapper[4747]: E0930 18:57:41.138022 4747 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(f2311a7961ab54ab0c713a6d9782029e4e1aa844194bbaefc819597d6c8ca244): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 18:57:41 crc kubenswrapper[4747]: E0930 18:57:41.138760 4747 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(f2311a7961ab54ab0c713a6d9782029e4e1aa844194bbaefc819597d6c8ca244): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:41 crc kubenswrapper[4747]: E0930 18:57:41.138804 4747 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(f2311a7961ab54ab0c713a6d9782029e4e1aa844194bbaefc819597d6c8ca244): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:41 crc kubenswrapper[4747]: E0930 18:57:41.138889 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ws48w_crc-storage(6fc34c99-fd5a-4646-9b60-2dee2279a6d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ws48w_crc-storage(6fc34c99-fd5a-4646-9b60-2dee2279a6d4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ws48w_crc-storage_6fc34c99-fd5a-4646-9b60-2dee2279a6d4_0(f2311a7961ab54ab0c713a6d9782029e4e1aa844194bbaefc819597d6c8ca244): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ws48w" podUID="6fc34c99-fd5a-4646-9b60-2dee2279a6d4" Sep 30 18:57:50 crc kubenswrapper[4747]: I0930 18:57:50.087488 4747 scope.go:117] "RemoveContainer" containerID="aa3692f348ec4e24682c7900affd59c3ddc6c6a3de5a1e5a2f45a754c971356d" Sep 30 18:57:50 crc kubenswrapper[4747]: I0930 18:57:50.774522 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4zjq4_34f8698b-7682-4b27-99d0-d72fff30d5a8/kube-multus/2.log" Sep 30 18:57:50 crc kubenswrapper[4747]: I0930 18:57:50.774995 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4zjq4" event={"ID":"34f8698b-7682-4b27-99d0-d72fff30d5a8","Type":"ContainerStarted","Data":"a408eb2fb16be83b0b31ec18c3ff0849f92ece5770313c0400472a8b8f5b3490"} Sep 30 18:57:54 crc kubenswrapper[4747]: I0930 18:57:54.032550 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2j2mw" Sep 30 18:57:55 crc kubenswrapper[4747]: I0930 18:57:55.087096 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:55 crc kubenswrapper[4747]: I0930 18:57:55.088018 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:55 crc kubenswrapper[4747]: I0930 18:57:55.372436 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ws48w"] Sep 30 18:57:55 crc kubenswrapper[4747]: I0930 18:57:55.382238 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:57:55 crc kubenswrapper[4747]: I0930 18:57:55.811452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ws48w" event={"ID":"6fc34c99-fd5a-4646-9b60-2dee2279a6d4","Type":"ContainerStarted","Data":"9ed51475b1c4ee8fdc0547464e510d07b6dd66ffc05abefdc9fc3fec3700a7b8"} Sep 30 18:57:57 crc kubenswrapper[4747]: I0930 18:57:57.826406 4747 generic.go:334] "Generic (PLEG): container finished" podID="6fc34c99-fd5a-4646-9b60-2dee2279a6d4" containerID="a1221574e4311bbcfd8b0878e60abfb4d67cb57e233780708e1d61001202c309" exitCode=0 Sep 30 18:57:57 crc kubenswrapper[4747]: I0930 18:57:57.826532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ws48w" event={"ID":"6fc34c99-fd5a-4646-9b60-2dee2279a6d4","Type":"ContainerDied","Data":"a1221574e4311bbcfd8b0878e60abfb4d67cb57e233780708e1d61001202c309"} Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.118753 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.206807 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-crc-storage\") pod \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.206857 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whfvw\" (UniqueName: \"kubernetes.io/projected/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-kube-api-access-whfvw\") pod \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.206883 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-node-mnt\") pod \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\" (UID: \"6fc34c99-fd5a-4646-9b60-2dee2279a6d4\") " Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.207236 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6fc34c99-fd5a-4646-9b60-2dee2279a6d4" (UID: "6fc34c99-fd5a-4646-9b60-2dee2279a6d4"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.218385 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-kube-api-access-whfvw" (OuterVolumeSpecName: "kube-api-access-whfvw") pod "6fc34c99-fd5a-4646-9b60-2dee2279a6d4" (UID: "6fc34c99-fd5a-4646-9b60-2dee2279a6d4"). InnerVolumeSpecName "kube-api-access-whfvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.235091 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6fc34c99-fd5a-4646-9b60-2dee2279a6d4" (UID: "6fc34c99-fd5a-4646-9b60-2dee2279a6d4"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.307785 4747 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-crc-storage\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.307818 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whfvw\" (UniqueName: \"kubernetes.io/projected/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-kube-api-access-whfvw\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.307831 4747 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6fc34c99-fd5a-4646-9b60-2dee2279a6d4-node-mnt\") on node \"crc\" DevicePath \"\"" Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.844352 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ws48w" event={"ID":"6fc34c99-fd5a-4646-9b60-2dee2279a6d4","Type":"ContainerDied","Data":"9ed51475b1c4ee8fdc0547464e510d07b6dd66ffc05abefdc9fc3fec3700a7b8"} Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.844411 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed51475b1c4ee8fdc0547464e510d07b6dd66ffc05abefdc9fc3fec3700a7b8" Sep 30 18:57:59 crc kubenswrapper[4747]: I0930 18:57:59.844679 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ws48w" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.100614 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6"] Sep 30 18:58:06 crc kubenswrapper[4747]: E0930 18:58:06.101364 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc34c99-fd5a-4646-9b60-2dee2279a6d4" containerName="storage" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.101381 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc34c99-fd5a-4646-9b60-2dee2279a6d4" containerName="storage" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.101520 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fc34c99-fd5a-4646-9b60-2dee2279a6d4" containerName="storage" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.102231 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.105208 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.111550 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6"] Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.228582 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 
18:58:06.228732 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jdk\" (UniqueName: \"kubernetes.io/projected/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-kube-api-access-t2jdk\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.228805 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.330064 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.330167 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jdk\" (UniqueName: \"kubernetes.io/projected/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-kube-api-access-t2jdk\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.330245 4747 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.330874 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.331118 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.369814 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jdk\" (UniqueName: \"kubernetes.io/projected/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-kube-api-access-t2jdk\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.425661 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.676267 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6"] Sep 30 18:58:06 crc kubenswrapper[4747]: W0930 18:58:06.682596 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923fff81_6af4_4c7f_b77a_2d7f0d4a557c.slice/crio-2ceb1db65af4bfb3d3e56ff1c7b9f3cd37245632a1c639f5ce81eb3fe9713cf1 WatchSource:0}: Error finding container 2ceb1db65af4bfb3d3e56ff1c7b9f3cd37245632a1c639f5ce81eb3fe9713cf1: Status 404 returned error can't find the container with id 2ceb1db65af4bfb3d3e56ff1c7b9f3cd37245632a1c639f5ce81eb3fe9713cf1 Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.897213 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" event={"ID":"923fff81-6af4-4c7f-b77a-2d7f0d4a557c","Type":"ContainerStarted","Data":"9ff528d74fe07d267e615488598ee5b14164d394496971a01e5d0e70098c0d3e"} Sep 30 18:58:06 crc kubenswrapper[4747]: I0930 18:58:06.897302 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" event={"ID":"923fff81-6af4-4c7f-b77a-2d7f0d4a557c","Type":"ContainerStarted","Data":"2ceb1db65af4bfb3d3e56ff1c7b9f3cd37245632a1c639f5ce81eb3fe9713cf1"} Sep 30 18:58:07 crc kubenswrapper[4747]: I0930 18:58:07.655867 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:58:07 crc kubenswrapper[4747]: I0930 
18:58:07.656497 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:58:07 crc kubenswrapper[4747]: I0930 18:58:07.908891 4747 generic.go:334] "Generic (PLEG): container finished" podID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerID="9ff528d74fe07d267e615488598ee5b14164d394496971a01e5d0e70098c0d3e" exitCode=0 Sep 30 18:58:07 crc kubenswrapper[4747]: I0930 18:58:07.908987 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" event={"ID":"923fff81-6af4-4c7f-b77a-2d7f0d4a557c","Type":"ContainerDied","Data":"9ff528d74fe07d267e615488598ee5b14164d394496971a01e5d0e70098c0d3e"} Sep 30 18:58:09 crc kubenswrapper[4747]: I0930 18:58:09.926472 4747 generic.go:334] "Generic (PLEG): container finished" podID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerID="38e02b66aef334601fbcdcea0d0d03c4c1f330e46fd310dbf37feb3a141ceb5a" exitCode=0 Sep 30 18:58:09 crc kubenswrapper[4747]: I0930 18:58:09.926548 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" event={"ID":"923fff81-6af4-4c7f-b77a-2d7f0d4a557c","Type":"ContainerDied","Data":"38e02b66aef334601fbcdcea0d0d03c4c1f330e46fd310dbf37feb3a141ceb5a"} Sep 30 18:58:10 crc kubenswrapper[4747]: I0930 18:58:10.937593 4747 generic.go:334] "Generic (PLEG): container finished" podID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerID="bc4c5cd8f6cea22a87b3c7ef610d149ce11f5539d200cd5f1ce1eaf4320fc253" exitCode=0 Sep 30 18:58:10 crc kubenswrapper[4747]: I0930 18:58:10.937662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" event={"ID":"923fff81-6af4-4c7f-b77a-2d7f0d4a557c","Type":"ContainerDied","Data":"bc4c5cd8f6cea22a87b3c7ef610d149ce11f5539d200cd5f1ce1eaf4320fc253"} Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.321546 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.357893 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2jdk\" (UniqueName: \"kubernetes.io/projected/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-kube-api-access-t2jdk\") pod \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.358083 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-bundle\") pod \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.359262 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-util\") pod \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\" (UID: \"923fff81-6af4-4c7f-b77a-2d7f0d4a557c\") " Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.359784 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-bundle" (OuterVolumeSpecName: "bundle") pod "923fff81-6af4-4c7f-b77a-2d7f0d4a557c" (UID: "923fff81-6af4-4c7f-b77a-2d7f0d4a557c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.371317 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-kube-api-access-t2jdk" (OuterVolumeSpecName: "kube-api-access-t2jdk") pod "923fff81-6af4-4c7f-b77a-2d7f0d4a557c" (UID: "923fff81-6af4-4c7f-b77a-2d7f0d4a557c"). InnerVolumeSpecName "kube-api-access-t2jdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.385662 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-util" (OuterVolumeSpecName: "util") pod "923fff81-6af4-4c7f-b77a-2d7f0d4a557c" (UID: "923fff81-6af4-4c7f-b77a-2d7f0d4a557c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.461517 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-util\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.461694 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2jdk\" (UniqueName: \"kubernetes.io/projected/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-kube-api-access-t2jdk\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.461727 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/923fff81-6af4-4c7f-b77a-2d7f0d4a557c-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.955401 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" 
event={"ID":"923fff81-6af4-4c7f-b77a-2d7f0d4a557c","Type":"ContainerDied","Data":"2ceb1db65af4bfb3d3e56ff1c7b9f3cd37245632a1c639f5ce81eb3fe9713cf1"} Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.955468 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ceb1db65af4bfb3d3e56ff1c7b9f3cd37245632a1c639f5ce81eb3fe9713cf1" Sep 30 18:58:12 crc kubenswrapper[4747]: I0930 18:58:12.955502 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.690389 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96"] Sep 30 18:58:17 crc kubenswrapper[4747]: E0930 18:58:17.691055 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerName="extract" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.691077 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerName="extract" Sep 30 18:58:17 crc kubenswrapper[4747]: E0930 18:58:17.691106 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerName="pull" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.691119 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerName="pull" Sep 30 18:58:17 crc kubenswrapper[4747]: E0930 18:58:17.691140 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerName="util" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.691154 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerName="util" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.691332 4747 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="923fff81-6af4-4c7f-b77a-2d7f0d4a557c" containerName="extract" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.691909 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.695596 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.702285 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.702999 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qj586" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.708423 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96"] Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.738898 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvkdp\" (UniqueName: \"kubernetes.io/projected/c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07-kube-api-access-kvkdp\") pod \"nmstate-operator-5d6f6cfd66-m7n96\" (UID: \"c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.841138 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvkdp\" (UniqueName: \"kubernetes.io/projected/c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07-kube-api-access-kvkdp\") pod \"nmstate-operator-5d6f6cfd66-m7n96\" (UID: \"c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96" Sep 30 18:58:17 crc kubenswrapper[4747]: I0930 18:58:17.868146 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kvkdp\" (UniqueName: \"kubernetes.io/projected/c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07-kube-api-access-kvkdp\") pod \"nmstate-operator-5d6f6cfd66-m7n96\" (UID: \"c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96" Sep 30 18:58:18 crc kubenswrapper[4747]: I0930 18:58:18.017541 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96" Sep 30 18:58:18 crc kubenswrapper[4747]: I0930 18:58:18.385865 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96"] Sep 30 18:58:18 crc kubenswrapper[4747]: I0930 18:58:18.996464 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96" event={"ID":"c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07","Type":"ContainerStarted","Data":"d355a3908f1183112037839ba9a733f82a0690db3d7f5cf064b6c96b5dc4b33b"} Sep 30 18:58:22 crc kubenswrapper[4747]: I0930 18:58:22.018460 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96" event={"ID":"c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07","Type":"ContainerStarted","Data":"f1bfd76f6ba50973ef5da4951a8ea48b417541f9aba2465abfdf66b8d145a3a4"} Sep 30 18:58:22 crc kubenswrapper[4747]: I0930 18:58:22.045463 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m7n96" podStartSLOduration=2.50707676 podStartE2EDuration="5.04542391s" podCreationTimestamp="2025-09-30 18:58:17 +0000 UTC" firstStartedPulling="2025-09-30 18:58:18.398532106 +0000 UTC m=+738.058012230" lastFinishedPulling="2025-09-30 18:58:20.936879256 +0000 UTC m=+740.596359380" observedRunningTime="2025-09-30 18:58:22.038707977 +0000 UTC m=+741.698188091" watchObservedRunningTime="2025-09-30 18:58:22.04542391 +0000 UTC m=+741.704904074" Sep 30 18:58:26 crc 
kubenswrapper[4747]: I0930 18:58:26.468114 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.473053 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.484975 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-9gvgp" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.494306 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-hr477"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.495595 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.500568 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.501622 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.504726 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-hr477"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.539306 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-r5l5k"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.539982 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.611385 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggl6\" (UniqueName: \"kubernetes.io/projected/b5822edc-1696-439a-b703-5dbc2720e0aa-kube-api-access-8ggl6\") pod \"nmstate-metrics-58fcddf996-bh6p4\" (UID: \"b5822edc-1696-439a-b703-5dbc2720e0aa\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.611455 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f64zz\" (UniqueName: \"kubernetes.io/projected/5997966c-5225-4adf-a02c-7ed6788335c2-kube-api-access-f64zz\") pod \"nmstate-webhook-6d689559c5-hr477\" (UID: \"5997966c-5225-4adf-a02c-7ed6788335c2\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.611487 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5997966c-5225-4adf-a02c-7ed6788335c2-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-hr477\" (UID: \"5997966c-5225-4adf-a02c-7ed6788335c2\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.614131 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.614788 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.618725 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.619003 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8h8kt" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.619182 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.627580 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.712566 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkcst\" (UniqueName: \"kubernetes.io/projected/11d1c978-5e5f-4957-9963-2194e68f6cd7-kube-api-access-tkcst\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.712624 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f64zz\" (UniqueName: \"kubernetes.io/projected/5997966c-5225-4adf-a02c-7ed6788335c2-kube-api-access-f64zz\") pod \"nmstate-webhook-6d689559c5-hr477\" (UID: \"5997966c-5225-4adf-a02c-7ed6788335c2\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.712647 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-ovs-socket\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " 
pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.712854 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-dbus-socket\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.712899 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5997966c-5225-4adf-a02c-7ed6788335c2-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-hr477\" (UID: \"5997966c-5225-4adf-a02c-7ed6788335c2\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.712959 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggl6\" (UniqueName: \"kubernetes.io/projected/b5822edc-1696-439a-b703-5dbc2720e0aa-kube-api-access-8ggl6\") pod \"nmstate-metrics-58fcddf996-bh6p4\" (UID: \"b5822edc-1696-439a-b703-5dbc2720e0aa\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.712985 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-nmstate-lock\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: E0930 18:58:26.713135 4747 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 30 18:58:26 crc kubenswrapper[4747]: E0930 18:58:26.713240 4747 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5997966c-5225-4adf-a02c-7ed6788335c2-tls-key-pair podName:5997966c-5225-4adf-a02c-7ed6788335c2 nodeName:}" failed. No retries permitted until 2025-09-30 18:58:27.213217255 +0000 UTC m=+746.872697369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5997966c-5225-4adf-a02c-7ed6788335c2-tls-key-pair") pod "nmstate-webhook-6d689559c5-hr477" (UID: "5997966c-5225-4adf-a02c-7ed6788335c2") : secret "openshift-nmstate-webhook" not found Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.733975 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f64zz\" (UniqueName: \"kubernetes.io/projected/5997966c-5225-4adf-a02c-7ed6788335c2-kube-api-access-f64zz\") pod \"nmstate-webhook-6d689559c5-hr477\" (UID: \"5997966c-5225-4adf-a02c-7ed6788335c2\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.740806 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggl6\" (UniqueName: \"kubernetes.io/projected/b5822edc-1696-439a-b703-5dbc2720e0aa-kube-api-access-8ggl6\") pod \"nmstate-metrics-58fcddf996-bh6p4\" (UID: \"b5822edc-1696-439a-b703-5dbc2720e0aa\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.797532 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6857d79cbf-vl8kz"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.798177 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.809117 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.809328 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6857d79cbf-vl8kz"] Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.813827 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/874598d6-d91d-40d3-b313-e383cf6410e7-console-serving-cert\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.813870 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-nmstate-lock\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.813903 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqwkn\" (UniqueName: \"kubernetes.io/projected/e3ab74aa-6201-4f76-b93a-04339dca4de7-kube-api-access-cqwkn\") pod \"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.813960 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-nmstate-lock\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.813965 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3ab74aa-6201-4f76-b93a-04339dca4de7-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814044 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkcst\" (UniqueName: \"kubernetes.io/projected/11d1c978-5e5f-4957-9963-2194e68f6cd7-kube-api-access-tkcst\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814068 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk5jh\" (UniqueName: \"kubernetes.io/projected/874598d6-d91d-40d3-b313-e383cf6410e7-kube-api-access-mk5jh\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814096 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-service-ca\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814121 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-ovs-socket\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814142 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ab74aa-6201-4f76-b93a-04339dca4de7-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814162 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-oauth-serving-cert\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814184 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-dbus-socket\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814202 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-trusted-ca-bundle\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814219 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-console-config\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 
18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814258 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/874598d6-d91d-40d3-b313-e383cf6410e7-console-oauth-config\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814553 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-ovs-socket\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.814841 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/11d1c978-5e5f-4957-9963-2194e68f6cd7-dbus-socket\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.843323 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkcst\" (UniqueName: \"kubernetes.io/projected/11d1c978-5e5f-4957-9963-2194e68f6cd7-kube-api-access-tkcst\") pod \"nmstate-handler-r5l5k\" (UID: \"11d1c978-5e5f-4957-9963-2194e68f6cd7\") " pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.867181 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915028 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqwkn\" (UniqueName: \"kubernetes.io/projected/e3ab74aa-6201-4f76-b93a-04339dca4de7-kube-api-access-cqwkn\") pod \"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915075 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3ab74aa-6201-4f76-b93a-04339dca4de7-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915114 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk5jh\" (UniqueName: \"kubernetes.io/projected/874598d6-d91d-40d3-b313-e383cf6410e7-kube-api-access-mk5jh\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915149 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-service-ca\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915185 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ab74aa-6201-4f76-b93a-04339dca4de7-plugin-serving-cert\") pod 
\"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915211 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-oauth-serving-cert\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915249 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-trusted-ca-bundle\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915273 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-console-config\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/874598d6-d91d-40d3-b313-e383cf6410e7-console-oauth-config\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.915356 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/874598d6-d91d-40d3-b313-e383cf6410e7-console-serving-cert\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.916389 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3ab74aa-6201-4f76-b93a-04339dca4de7-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.916420 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-service-ca\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.917157 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-oauth-serving-cert\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.917229 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-console-config\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.918152 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/874598d6-d91d-40d3-b313-e383cf6410e7-trusted-ca-bundle\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.920478 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/874598d6-d91d-40d3-b313-e383cf6410e7-console-oauth-config\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.923949 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ab74aa-6201-4f76-b93a-04339dca4de7-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.926416 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/874598d6-d91d-40d3-b313-e383cf6410e7-console-serving-cert\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.941382 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqwkn\" (UniqueName: \"kubernetes.io/projected/e3ab74aa-6201-4f76-b93a-04339dca4de7-kube-api-access-cqwkn\") pod \"nmstate-console-plugin-864bb6dfb5-c6cft\" (UID: \"e3ab74aa-6201-4f76-b93a-04339dca4de7\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:26 crc kubenswrapper[4747]: I0930 18:58:26.957593 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mk5jh\" (UniqueName: \"kubernetes.io/projected/874598d6-d91d-40d3-b313-e383cf6410e7-kube-api-access-mk5jh\") pod \"console-6857d79cbf-vl8kz\" (UID: \"874598d6-d91d-40d3-b313-e383cf6410e7\") " pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.013557 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4"] Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.055019 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-r5l5k" event={"ID":"11d1c978-5e5f-4957-9963-2194e68f6cd7","Type":"ContainerStarted","Data":"3e9b57fec130d533382c43c269dcb89ed31f9f190d0c87e9c64dca4618509fb3"} Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.056039 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" event={"ID":"b5822edc-1696-439a-b703-5dbc2720e0aa","Type":"ContainerStarted","Data":"8f52d37ab53c1e8e1190e4961d1a9a3b5759efd80d74f4ab1b11a4f3aa0e2387"} Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.111755 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.218062 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5997966c-5225-4adf-a02c-7ed6788335c2-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-hr477\" (UID: \"5997966c-5225-4adf-a02c-7ed6788335c2\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.222587 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5997966c-5225-4adf-a02c-7ed6788335c2-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-hr477\" (UID: \"5997966c-5225-4adf-a02c-7ed6788335c2\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.227408 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.313784 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6857d79cbf-vl8kz"] Sep 30 18:58:27 crc kubenswrapper[4747]: W0930 18:58:27.317947 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874598d6_d91d_40d3_b313_e383cf6410e7.slice/crio-12cef476c7ea7aaf1e8a84756352930516c4676f4e1492ac8f26e0c5b9f3c7be WatchSource:0}: Error finding container 12cef476c7ea7aaf1e8a84756352930516c4676f4e1492ac8f26e0c5b9f3c7be: Status 404 returned error can't find the container with id 12cef476c7ea7aaf1e8a84756352930516c4676f4e1492ac8f26e0c5b9f3c7be Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.415278 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.427585 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft"] Sep 30 18:58:27 crc kubenswrapper[4747]: W0930 18:58:27.434986 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ab74aa_6201_4f76_b93a_04339dca4de7.slice/crio-f96a7821f961d7971838c58aa4f38c3f111670e211838dbffcd5fb2ca4287430 WatchSource:0}: Error finding container f96a7821f961d7971838c58aa4f38c3f111670e211838dbffcd5fb2ca4287430: Status 404 returned error can't find the container with id f96a7821f961d7971838c58aa4f38c3f111670e211838dbffcd5fb2ca4287430 Sep 30 18:58:27 crc kubenswrapper[4747]: I0930 18:58:27.597642 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-hr477"] Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.069048 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" event={"ID":"5997966c-5225-4adf-a02c-7ed6788335c2","Type":"ContainerStarted","Data":"5be84c803dd27a0b1b3d4ab147d34f004a0d0c0c11236e13ad09eabd82f5b209"} Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.073522 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6857d79cbf-vl8kz" event={"ID":"874598d6-d91d-40d3-b313-e383cf6410e7","Type":"ContainerStarted","Data":"1fca00ad7511e529d277d4b9ab8999a45a83f7fe89937dbf1a2e87f320790760"} Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.073585 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6857d79cbf-vl8kz" event={"ID":"874598d6-d91d-40d3-b313-e383cf6410e7","Type":"ContainerStarted","Data":"12cef476c7ea7aaf1e8a84756352930516c4676f4e1492ac8f26e0c5b9f3c7be"} Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 
18:58:28.075214 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmpsg"] Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.076013 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" event={"ID":"e3ab74aa-6201-4f76-b93a-04339dca4de7","Type":"ContainerStarted","Data":"f96a7821f961d7971838c58aa4f38c3f111670e211838dbffcd5fb2ca4287430"} Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.076770 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" podUID="05ea429b-cd6a-466f-a2ff-d469a1ed572c" containerName="controller-manager" containerID="cri-o://69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7" gracePeriod=30 Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.107749 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6857d79cbf-vl8kz" podStartSLOduration=2.107731977 podStartE2EDuration="2.107731977s" podCreationTimestamp="2025-09-30 18:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:58:28.106668316 +0000 UTC m=+747.766148440" watchObservedRunningTime="2025-09-30 18:58:28.107731977 +0000 UTC m=+747.767212111" Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.173635 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"] Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.173827 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" podUID="00a32b13-e38c-424a-8db2-92ea1032208b" containerName="route-controller-manager" 
containerID="cri-o://79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315" gracePeriod=30
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.539539 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.589440 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.674050 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-client-ca\") pod \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.674102 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-proxy-ca-bundles\") pod \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.674172 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptgfx\" (UniqueName: \"kubernetes.io/projected/05ea429b-cd6a-466f-a2ff-d469a1ed572c-kube-api-access-ptgfx\") pod \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.675036 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "05ea429b-cd6a-466f-a2ff-d469a1ed572c" (UID: "05ea429b-cd6a-466f-a2ff-d469a1ed572c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.675057 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-client-ca" (OuterVolumeSpecName: "client-ca") pod "05ea429b-cd6a-466f-a2ff-d469a1ed572c" (UID: "05ea429b-cd6a-466f-a2ff-d469a1ed572c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.675251 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert\") pod \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.675337 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config\") pod \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\" (UID: \"05ea429b-cd6a-466f-a2ff-d469a1ed572c\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.675785 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-client-ca\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.675807 4747 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.676243 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config" (OuterVolumeSpecName: "config") pod "05ea429b-cd6a-466f-a2ff-d469a1ed572c" (UID: "05ea429b-cd6a-466f-a2ff-d469a1ed572c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.679588 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "05ea429b-cd6a-466f-a2ff-d469a1ed572c" (UID: "05ea429b-cd6a-466f-a2ff-d469a1ed572c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.686619 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ea429b-cd6a-466f-a2ff-d469a1ed572c-kube-api-access-ptgfx" (OuterVolumeSpecName: "kube-api-access-ptgfx") pod "05ea429b-cd6a-466f-a2ff-d469a1ed572c" (UID: "05ea429b-cd6a-466f-a2ff-d469a1ed572c"). InnerVolumeSpecName "kube-api-access-ptgfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.776077 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-client-ca\") pod \"00a32b13-e38c-424a-8db2-92ea1032208b\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.776117 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-config\") pod \"00a32b13-e38c-424a-8db2-92ea1032208b\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.776148 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a32b13-e38c-424a-8db2-92ea1032208b-serving-cert\") pod \"00a32b13-e38c-424a-8db2-92ea1032208b\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.776177 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkhb9\" (UniqueName: \"kubernetes.io/projected/00a32b13-e38c-424a-8db2-92ea1032208b-kube-api-access-lkhb9\") pod \"00a32b13-e38c-424a-8db2-92ea1032208b\" (UID: \"00a32b13-e38c-424a-8db2-92ea1032208b\") "
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.776328 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptgfx\" (UniqueName: \"kubernetes.io/projected/05ea429b-cd6a-466f-a2ff-d469a1ed572c-kube-api-access-ptgfx\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.776358 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05ea429b-cd6a-466f-a2ff-d469a1ed572c-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.776367 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ea429b-cd6a-466f-a2ff-d469a1ed572c-config\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.776903 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-client-ca" (OuterVolumeSpecName: "client-ca") pod "00a32b13-e38c-424a-8db2-92ea1032208b" (UID: "00a32b13-e38c-424a-8db2-92ea1032208b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.777680 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-config" (OuterVolumeSpecName: "config") pod "00a32b13-e38c-424a-8db2-92ea1032208b" (UID: "00a32b13-e38c-424a-8db2-92ea1032208b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.780032 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a32b13-e38c-424a-8db2-92ea1032208b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00a32b13-e38c-424a-8db2-92ea1032208b" (UID: "00a32b13-e38c-424a-8db2-92ea1032208b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.797806 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a32b13-e38c-424a-8db2-92ea1032208b-kube-api-access-lkhb9" (OuterVolumeSpecName: "kube-api-access-lkhb9") pod "00a32b13-e38c-424a-8db2-92ea1032208b" (UID: "00a32b13-e38c-424a-8db2-92ea1032208b"). InnerVolumeSpecName "kube-api-access-lkhb9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.876998 4747 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a32b13-e38c-424a-8db2-92ea1032208b-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.877027 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkhb9\" (UniqueName: \"kubernetes.io/projected/00a32b13-e38c-424a-8db2-92ea1032208b-kube-api-access-lkhb9\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.877036 4747 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-client-ca\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:28 crc kubenswrapper[4747]: I0930 18:58:28.877045 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a32b13-e38c-424a-8db2-92ea1032208b-config\") on node \"crc\" DevicePath \"\""
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.081933 4747 generic.go:334] "Generic (PLEG): container finished" podID="00a32b13-e38c-424a-8db2-92ea1032208b" containerID="79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315" exitCode=0
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.081999 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.082022 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" event={"ID":"00a32b13-e38c-424a-8db2-92ea1032208b","Type":"ContainerDied","Data":"79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315"}
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.082045 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt" event={"ID":"00a32b13-e38c-424a-8db2-92ea1032208b","Type":"ContainerDied","Data":"4c5fee074aecd396eff8d9edf8d1c6defa98e2d83a2a980517d203ea4f0ad11f"}
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.082081 4747 scope.go:117] "RemoveContainer" containerID="79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.085709 4747 generic.go:334] "Generic (PLEG): container finished" podID="05ea429b-cd6a-466f-a2ff-d469a1ed572c" containerID="69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7" exitCode=0
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.085751 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.085824 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" event={"ID":"05ea429b-cd6a-466f-a2ff-d469a1ed572c","Type":"ContainerDied","Data":"69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7"}
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.085897 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nmpsg" event={"ID":"05ea429b-cd6a-466f-a2ff-d469a1ed572c","Type":"ContainerDied","Data":"538502bdb9208238c6326e90eee785b69c78f045d6bee25d107b8addfbaaeb3b"}
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.109839 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"]
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.119808 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vcdjt"]
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.123358 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmpsg"]
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.126445 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nmpsg"]
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.663528 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"]
Sep 30 18:58:29 crc kubenswrapper[4747]: E0930 18:58:29.663760 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a32b13-e38c-424a-8db2-92ea1032208b" containerName="route-controller-manager"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.663776 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a32b13-e38c-424a-8db2-92ea1032208b" containerName="route-controller-manager"
Sep 30 18:58:29 crc kubenswrapper[4747]: E0930 18:58:29.663798 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ea429b-cd6a-466f-a2ff-d469a1ed572c" containerName="controller-manager"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.663807 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ea429b-cd6a-466f-a2ff-d469a1ed572c" containerName="controller-manager"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.663917 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a32b13-e38c-424a-8db2-92ea1032208b" containerName="route-controller-manager"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.663952 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ea429b-cd6a-466f-a2ff-d469a1ed572c" containerName="controller-manager"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.664348 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.667545 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.667655 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.667699 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.667699 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.667674 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.667669 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.684879 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"]
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.699539 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abcd04b-d774-4463-b573-89684c062b26-serving-cert\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.699966 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abcd04b-d774-4463-b573-89684c062b26-config\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.700042 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3abcd04b-d774-4463-b573-89684c062b26-client-ca\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.700073 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr8zn\" (UniqueName: \"kubernetes.io/projected/3abcd04b-d774-4463-b573-89684c062b26-kube-api-access-pr8zn\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.802108 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr8zn\" (UniqueName: \"kubernetes.io/projected/3abcd04b-d774-4463-b573-89684c062b26-kube-api-access-pr8zn\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.802201 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abcd04b-d774-4463-b573-89684c062b26-serving-cert\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.802235 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abcd04b-d774-4463-b573-89684c062b26-config\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.802307 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3abcd04b-d774-4463-b573-89684c062b26-client-ca\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.803603 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3abcd04b-d774-4463-b573-89684c062b26-client-ca\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.803761 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abcd04b-d774-4463-b573-89684c062b26-config\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.813279 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abcd04b-d774-4463-b573-89684c062b26-serving-cert\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.829373 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr8zn\" (UniqueName: \"kubernetes.io/projected/3abcd04b-d774-4463-b573-89684c062b26-kube-api-access-pr8zn\") pod \"route-controller-manager-7cb67459d6-9brx6\" (UID: \"3abcd04b-d774-4463-b573-89684c062b26\") " pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.985739 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.989285 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"]
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.990067 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.992395 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.993469 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.993676 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.993805 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.993986 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.994260 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Sep 30 18:58:29 crc kubenswrapper[4747]: I0930 18:58:29.998969 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"]
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.003332 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.008808 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqq9c\" (UniqueName: \"kubernetes.io/projected/f5148551-71ac-464b-bc2e-72dab57dc5d7-kube-api-access-rqq9c\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.009114 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-proxy-ca-bundles\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.009159 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-client-ca\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.009439 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-config\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.009515 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5148551-71ac-464b-bc2e-72dab57dc5d7-serving-cert\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.110391 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-proxy-ca-bundles\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.110503 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-client-ca\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.110553 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-config\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.110644 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5148551-71ac-464b-bc2e-72dab57dc5d7-serving-cert\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.110773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqq9c\" (UniqueName: \"kubernetes.io/projected/f5148551-71ac-464b-bc2e-72dab57dc5d7-kube-api-access-rqq9c\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.112226 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-client-ca\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.113103 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-proxy-ca-bundles\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.113573 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5148551-71ac-464b-bc2e-72dab57dc5d7-config\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.117509 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5148551-71ac-464b-bc2e-72dab57dc5d7-serving-cert\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.125799 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqq9c\" (UniqueName: \"kubernetes.io/projected/f5148551-71ac-464b-bc2e-72dab57dc5d7-kube-api-access-rqq9c\") pod \"controller-manager-5bd7b89df4-d2n6s\" (UID: \"f5148551-71ac-464b-bc2e-72dab57dc5d7\") " pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.326371 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.521092 4747 scope.go:117] "RemoveContainer" containerID="79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315"
Sep 30 18:58:30 crc kubenswrapper[4747]: E0930 18:58:30.527734 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315\": container with ID starting with 79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315 not found: ID does not exist" containerID="79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.527773 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315"} err="failed to get container status \"79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315\": rpc error: code = NotFound desc = could not find container \"79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315\": container with ID starting with 79fa86ddd1099f797bcc547f747a842330e226a099e647cbe3f8177faf07b315 not found: ID does not exist"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.527799 4747 scope.go:117] "RemoveContainer" containerID="69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.634349 4747 scope.go:117] "RemoveContainer" containerID="69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7"
Sep 30 18:58:30 crc kubenswrapper[4747]: E0930 18:58:30.634828 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7\": container with ID starting with 69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7 not found: ID does not exist" containerID="69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.634870 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7"} err="failed to get container status \"69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7\": rpc error: code = NotFound desc = could not find container \"69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7\": container with ID starting with 69c8237253670b842ec229a6a4f6f49b035c27ccccd266b7d33ef7f5c6bb44c7 not found: ID does not exist"
Sep 30 18:58:30 crc kubenswrapper[4747]: I0930 18:58:30.882074 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"]
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.096710 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a32b13-e38c-424a-8db2-92ea1032208b" path="/var/lib/kubelet/pods/00a32b13-e38c-424a-8db2-92ea1032208b/volumes"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.097494 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ea429b-cd6a-466f-a2ff-d469a1ed572c" path="/var/lib/kubelet/pods/05ea429b-cd6a-466f-a2ff-d469a1ed572c/volumes"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.097946 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-r5l5k" event={"ID":"11d1c978-5e5f-4957-9963-2194e68f6cd7","Type":"ContainerStarted","Data":"a69e3c702c2574b10e2777739156da089d4a283a4306d48a388cb830a489503a"}
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.097979 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-r5l5k"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.098415 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6" event={"ID":"3abcd04b-d774-4463-b573-89684c062b26","Type":"ContainerStarted","Data":"30e7c8e274c58ec8144f12dc0273b126376146c2efb1b15de9dfa154ca7882bb"}
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.098439 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6" event={"ID":"3abcd04b-d774-4463-b573-89684c062b26","Type":"ContainerStarted","Data":"ad8b4f80796fb656465756e227507252b12489bd781a87360c153e10c550ae44"}
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.098584 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.100957 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" event={"ID":"b5822edc-1696-439a-b703-5dbc2720e0aa","Type":"ContainerStarted","Data":"bec30fd1e8a0691ab058bcb9bb22b3c3a27d927eca51c49b65ae2eca3b4df934"}
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.102046 4747 patch_prober.go:28] interesting pod/route-controller-manager-7cb67459d6-9brx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body=
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.102093 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6" podUID="3abcd04b-d774-4463-b573-89684c062b26" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.102435 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" event={"ID":"e3ab74aa-6201-4f76-b93a-04339dca4de7","Type":"ContainerStarted","Data":"81bc9a7ed32cd645fe8e9ba15416b6ddbcc0945cabe13e05a4eb2c8d25561701"}
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.104872 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" event={"ID":"5997966c-5225-4adf-a02c-7ed6788335c2","Type":"ContainerStarted","Data":"e907488d1b8c29ea0e051780da93829f6e9fb1a6aaa289f3a2f76885e821f7d6"}
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.105060 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.180605 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6" podStartSLOduration=2.180583515 podStartE2EDuration="2.180583515s" podCreationTimestamp="2025-09-30 18:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:58:31.176864358 +0000 UTC m=+750.836344482" watchObservedRunningTime="2025-09-30 18:58:31.180583515 +0000 UTC m=+750.840063629"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.181056 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"]
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.199204 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-c6cft" podStartSLOduration=1.969562402 podStartE2EDuration="5.199185211s" podCreationTimestamp="2025-09-30 18:58:26 +0000 UTC" firstStartedPulling="2025-09-30 18:58:27.436896826 +0000 UTC m=+747.096376940" lastFinishedPulling="2025-09-30 18:58:30.666519635 +0000 UTC m=+750.325999749" observedRunningTime="2025-09-30 18:58:31.196826313 +0000 UTC m=+750.856306427" watchObservedRunningTime="2025-09-30 18:58:31.199185211 +0000 UTC m=+750.858665315"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.221619 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-r5l5k" podStartSLOduration=1.4565009899999999 podStartE2EDuration="5.221592645s" podCreationTimestamp="2025-09-30 18:58:26 +0000 UTC" firstStartedPulling="2025-09-30 18:58:26.900213525 +0000 UTC m=+746.559693639" lastFinishedPulling="2025-09-30 18:58:30.66530517 +0000 UTC m=+750.324785294" observedRunningTime="2025-09-30 18:58:31.219699021 +0000 UTC m=+750.879179145" watchObservedRunningTime="2025-09-30 18:58:31.221592645 +0000 UTC m=+750.881072759"
Sep 30 18:58:31 crc kubenswrapper[4747]: I0930 18:58:31.241448 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" podStartSLOduration=2.174801827 podStartE2EDuration="5.241426326s" podCreationTimestamp="2025-09-30 18:58:26 +0000 UTC" firstStartedPulling="2025-09-30 18:58:27.610158881 +0000 UTC m=+747.269638995" lastFinishedPulling="2025-09-30 18:58:30.67678336 +0000 UTC m=+750.336263494" observedRunningTime="2025-09-30 18:58:31.238465671 +0000 UTC m=+750.897945795" watchObservedRunningTime="2025-09-30 18:58:31.241426326 +0000 UTC m=+750.900906450"
Sep 30 18:58:32 crc kubenswrapper[4747]: I0930 18:58:32.113327 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s"
event={"ID":"f5148551-71ac-464b-bc2e-72dab57dc5d7","Type":"ContainerStarted","Data":"528d168ce76267dbe481eaf60da1cdcfd2f97d0f2d26b0ee7c5a008a36711c73"} Sep 30 18:58:32 crc kubenswrapper[4747]: I0930 18:58:32.113841 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s" Sep 30 18:58:32 crc kubenswrapper[4747]: I0930 18:58:32.113887 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s" event={"ID":"f5148551-71ac-464b-bc2e-72dab57dc5d7","Type":"ContainerStarted","Data":"f2925d917fed9e862e2981ce363238e682d238c8733943ff7bad5d52a2af5048"} Sep 30 18:58:32 crc kubenswrapper[4747]: I0930 18:58:32.123103 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s" Sep 30 18:58:32 crc kubenswrapper[4747]: I0930 18:58:32.123869 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cb67459d6-9brx6" Sep 30 18:58:32 crc kubenswrapper[4747]: I0930 18:58:32.145845 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bd7b89df4-d2n6s" podStartSLOduration=4.145823736 podStartE2EDuration="4.145823736s" podCreationTimestamp="2025-09-30 18:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:58:32.14351208 +0000 UTC m=+751.802992194" watchObservedRunningTime="2025-09-30 18:58:32.145823736 +0000 UTC m=+751.805303860" Sep 30 18:58:34 crc kubenswrapper[4747]: I0930 18:58:34.134020 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" 
event={"ID":"b5822edc-1696-439a-b703-5dbc2720e0aa","Type":"ContainerStarted","Data":"0802ff17378111405403b7344a35e038379d34bec4369a25a90efbc0097f20f5"} Sep 30 18:58:34 crc kubenswrapper[4747]: I0930 18:58:34.165521 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-bh6p4" podStartSLOduration=1.853943524 podStartE2EDuration="8.165493474s" podCreationTimestamp="2025-09-30 18:58:26 +0000 UTC" firstStartedPulling="2025-09-30 18:58:27.028674911 +0000 UTC m=+746.688155025" lastFinishedPulling="2025-09-30 18:58:33.340224861 +0000 UTC m=+752.999704975" observedRunningTime="2025-09-30 18:58:34.164696201 +0000 UTC m=+753.824176325" watchObservedRunningTime="2025-09-30 18:58:34.165493474 +0000 UTC m=+753.824973628" Sep 30 18:58:35 crc kubenswrapper[4747]: I0930 18:58:35.731501 4747 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 18:58:36 crc kubenswrapper[4747]: I0930 18:58:36.903273 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-r5l5k" Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.111892 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.111961 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.119690 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.162004 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6857d79cbf-vl8kz" Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.210228 4747 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-console/console-f9d7485db-sn6w4"] Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.656124 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.656281 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.656396 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.657742 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3b9f45b84cc1eae815bcdc0ad8efb2eb78da9ac4324427d149fbbf26250b353"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:58:37 crc kubenswrapper[4747]: I0930 18:58:37.657858 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://f3b9f45b84cc1eae815bcdc0ad8efb2eb78da9ac4324427d149fbbf26250b353" gracePeriod=600 Sep 30 18:58:38 crc kubenswrapper[4747]: I0930 18:58:38.164665 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="f3b9f45b84cc1eae815bcdc0ad8efb2eb78da9ac4324427d149fbbf26250b353" exitCode=0 Sep 30 18:58:38 crc kubenswrapper[4747]: I0930 18:58:38.164770 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"f3b9f45b84cc1eae815bcdc0ad8efb2eb78da9ac4324427d149fbbf26250b353"} Sep 30 18:58:38 crc kubenswrapper[4747]: I0930 18:58:38.165702 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"9324a2c247fa6850748cdd90467f095bd666e1119af1ca69c9f4d4385e9867bb"} Sep 30 18:58:38 crc kubenswrapper[4747]: I0930 18:58:38.165750 4747 scope.go:117] "RemoveContainer" containerID="d543f08c59a323444d6e9001d7802b512e4a86e59b1b1b4efef8d96bd15c0e26" Sep 30 18:58:47 crc kubenswrapper[4747]: I0930 18:58:47.430142 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-hr477" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.284997 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-sn6w4" podUID="5f90c236-a235-4782-8351-cad3bb90e3fa" containerName="console" containerID="cri-o://f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4" gracePeriod=15 Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.739771 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sn6w4_5f90c236-a235-4782-8351-cad3bb90e3fa/console/0.log" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.740096 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.822262 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert\") pod \"5f90c236-a235-4782-8351-cad3bb90e3fa\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.822320 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-service-ca\") pod \"5f90c236-a235-4782-8351-cad3bb90e3fa\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.822371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-oauth-config\") pod \"5f90c236-a235-4782-8351-cad3bb90e3fa\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.822415 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-console-config\") pod \"5f90c236-a235-4782-8351-cad3bb90e3fa\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.822452 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-trusted-ca-bundle\") pod \"5f90c236-a235-4782-8351-cad3bb90e3fa\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.822481 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-chr44\" (UniqueName: \"kubernetes.io/projected/5f90c236-a235-4782-8351-cad3bb90e3fa-kube-api-access-chr44\") pod \"5f90c236-a235-4782-8351-cad3bb90e3fa\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.822534 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-serving-cert\") pod \"5f90c236-a235-4782-8351-cad3bb90e3fa\" (UID: \"5f90c236-a235-4782-8351-cad3bb90e3fa\") " Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.822962 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5f90c236-a235-4782-8351-cad3bb90e3fa" (UID: "5f90c236-a235-4782-8351-cad3bb90e3fa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.823576 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-service-ca" (OuterVolumeSpecName: "service-ca") pod "5f90c236-a235-4782-8351-cad3bb90e3fa" (UID: "5f90c236-a235-4782-8351-cad3bb90e3fa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.823586 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5f90c236-a235-4782-8351-cad3bb90e3fa" (UID: "5f90c236-a235-4782-8351-cad3bb90e3fa"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.823570 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-console-config" (OuterVolumeSpecName: "console-config") pod "5f90c236-a235-4782-8351-cad3bb90e3fa" (UID: "5f90c236-a235-4782-8351-cad3bb90e3fa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.843092 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5f90c236-a235-4782-8351-cad3bb90e3fa" (UID: "5f90c236-a235-4782-8351-cad3bb90e3fa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.844022 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f90c236-a235-4782-8351-cad3bb90e3fa-kube-api-access-chr44" (OuterVolumeSpecName: "kube-api-access-chr44") pod "5f90c236-a235-4782-8351-cad3bb90e3fa" (UID: "5f90c236-a235-4782-8351-cad3bb90e3fa"). InnerVolumeSpecName "kube-api-access-chr44". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.844297 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5f90c236-a235-4782-8351-cad3bb90e3fa" (UID: "5f90c236-a235-4782-8351-cad3bb90e3fa"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.924018 4747 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.924245 4747 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.924256 4747 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.924264 4747 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f90c236-a235-4782-8351-cad3bb90e3fa-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.924273 4747 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.924280 4747 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f90c236-a235-4782-8351-cad3bb90e3fa-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:02 crc kubenswrapper[4747]: I0930 18:59:02.924288 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chr44\" (UniqueName: \"kubernetes.io/projected/5f90c236-a235-4782-8351-cad3bb90e3fa-kube-api-access-chr44\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:03 crc 
kubenswrapper[4747]: I0930 18:59:03.337564 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs"] Sep 30 18:59:03 crc kubenswrapper[4747]: E0930 18:59:03.338026 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f90c236-a235-4782-8351-cad3bb90e3fa" containerName="console" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.338053 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f90c236-a235-4782-8351-cad3bb90e3fa" containerName="console" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.338284 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f90c236-a235-4782-8351-cad3bb90e3fa" containerName="console" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.339655 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.342861 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.346972 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs"] Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.396883 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sn6w4_5f90c236-a235-4782-8351-cad3bb90e3fa/console/0.log" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.396990 4747 generic.go:334] "Generic (PLEG): container finished" podID="5f90c236-a235-4782-8351-cad3bb90e3fa" containerID="f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4" exitCode=2 Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.397033 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-sn6w4" event={"ID":"5f90c236-a235-4782-8351-cad3bb90e3fa","Type":"ContainerDied","Data":"f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4"} Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.397074 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sn6w4" event={"ID":"5f90c236-a235-4782-8351-cad3bb90e3fa","Type":"ContainerDied","Data":"41a27d36da8c39dcb547d5eed494bc54d3c97dbf24a3af90d8d4a9235d4fda25"} Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.397105 4747 scope.go:117] "RemoveContainer" containerID="f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.397554 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sn6w4" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.420084 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sn6w4"] Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.422505 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-sn6w4"] Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.425982 4747 scope.go:117] "RemoveContainer" containerID="f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4" Sep 30 18:59:03 crc kubenswrapper[4747]: E0930 18:59:03.426450 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4\": container with ID starting with f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4 not found: ID does not exist" containerID="f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.426495 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4"} err="failed to get container status \"f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4\": rpc error: code = NotFound desc = could not find container \"f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4\": container with ID starting with f026269eec847a0e1d2c836eb5f6aae2ee7bd298b6b08a07c492bb8ef4a29ac4 not found: ID does not exist" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.432372 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.432429 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvn8g\" (UniqueName: \"kubernetes.io/projected/5a83c062-1bc6-4c6a-83cf-40064d73606f-kube-api-access-zvn8g\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.432481 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.534201 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.534304 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvn8g\" (UniqueName: \"kubernetes.io/projected/5a83c062-1bc6-4c6a-83cf-40064d73606f-kube-api-access-zvn8g\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.534397 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.535199 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.535289 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-util\") 
pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.561109 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvn8g\" (UniqueName: \"kubernetes.io/projected/5a83c062-1bc6-4c6a-83cf-40064d73606f-kube-api-access-zvn8g\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:03 crc kubenswrapper[4747]: I0930 18:59:03.667513 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" Sep 30 18:59:04 crc kubenswrapper[4747]: I0930 18:59:04.139195 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs"] Sep 30 18:59:04 crc kubenswrapper[4747]: W0930 18:59:04.163419 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a83c062_1bc6_4c6a_83cf_40064d73606f.slice/crio-449e79db515d7bcbf31abd11d994d50c1585927485a6158ccd0676a0090f13dd WatchSource:0}: Error finding container 449e79db515d7bcbf31abd11d994d50c1585927485a6158ccd0676a0090f13dd: Status 404 returned error can't find the container with id 449e79db515d7bcbf31abd11d994d50c1585927485a6158ccd0676a0090f13dd Sep 30 18:59:04 crc kubenswrapper[4747]: I0930 18:59:04.404856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" 
event={"ID":"5a83c062-1bc6-4c6a-83cf-40064d73606f","Type":"ContainerStarted","Data":"b430f281956f6a3d9f27181d16036634f46ca71ac67f83db72828c25e17b2c65"} Sep 30 18:59:04 crc kubenswrapper[4747]: I0930 18:59:04.404916 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" event={"ID":"5a83c062-1bc6-4c6a-83cf-40064d73606f","Type":"ContainerStarted","Data":"449e79db515d7bcbf31abd11d994d50c1585927485a6158ccd0676a0090f13dd"} Sep 30 18:59:05 crc kubenswrapper[4747]: I0930 18:59:05.099970 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f90c236-a235-4782-8351-cad3bb90e3fa" path="/var/lib/kubelet/pods/5f90c236-a235-4782-8351-cad3bb90e3fa/volumes" Sep 30 18:59:05 crc kubenswrapper[4747]: I0930 18:59:05.416945 4747 generic.go:334] "Generic (PLEG): container finished" podID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerID="b430f281956f6a3d9f27181d16036634f46ca71ac67f83db72828c25e17b2c65" exitCode=0 Sep 30 18:59:05 crc kubenswrapper[4747]: I0930 18:59:05.417002 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" event={"ID":"5a83c062-1bc6-4c6a-83cf-40064d73606f","Type":"ContainerDied","Data":"b430f281956f6a3d9f27181d16036634f46ca71ac67f83db72828c25e17b2c65"} Sep 30 18:59:06 crc kubenswrapper[4747]: I0930 18:59:06.877682 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b6kbv"] Sep 30 18:59:06 crc kubenswrapper[4747]: I0930 18:59:06.886960 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:06 crc kubenswrapper[4747]: I0930 18:59:06.915209 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6kbv"]
Sep 30 18:59:06 crc kubenswrapper[4747]: I0930 18:59:06.994506 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-catalog-content\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:06 crc kubenswrapper[4747]: I0930 18:59:06.994556 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-utilities\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:06 crc kubenswrapper[4747]: I0930 18:59:06.994592 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdqjj\" (UniqueName: \"kubernetes.io/projected/1f2adf77-5bdd-459b-8829-f61c0cc70793-kube-api-access-kdqjj\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:07 crc kubenswrapper[4747]: I0930 18:59:07.095475 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-catalog-content\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:07 crc kubenswrapper[4747]: I0930 18:59:07.095530 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-utilities\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:07 crc kubenswrapper[4747]: I0930 18:59:07.095569 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdqjj\" (UniqueName: \"kubernetes.io/projected/1f2adf77-5bdd-459b-8829-f61c0cc70793-kube-api-access-kdqjj\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:07 crc kubenswrapper[4747]: I0930 18:59:07.096031 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-catalog-content\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:07 crc kubenswrapper[4747]: I0930 18:59:07.096361 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-utilities\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:07 crc kubenswrapper[4747]: I0930 18:59:07.115667 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdqjj\" (UniqueName: \"kubernetes.io/projected/1f2adf77-5bdd-459b-8829-f61c0cc70793-kube-api-access-kdqjj\") pod \"redhat-operators-b6kbv\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") " pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:07 crc kubenswrapper[4747]: I0930 18:59:07.220235 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:07 crc kubenswrapper[4747]: I0930 18:59:07.681469 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6kbv"]
Sep 30 18:59:07 crc kubenswrapper[4747]: W0930 18:59:07.690912 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2adf77_5bdd_459b_8829_f61c0cc70793.slice/crio-8f2c0a5124e49e183f83cadb89e007206f63abde90fc9d247d4e28c49680115e WatchSource:0}: Error finding container 8f2c0a5124e49e183f83cadb89e007206f63abde90fc9d247d4e28c49680115e: Status 404 returned error can't find the container with id 8f2c0a5124e49e183f83cadb89e007206f63abde90fc9d247d4e28c49680115e
Sep 30 18:59:08 crc kubenswrapper[4747]: I0930 18:59:08.439224 4747 generic.go:334] "Generic (PLEG): container finished" podID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerID="80a8d09fa80957f37985592a7002a1f8f051284964893df5e1437834c00ae657" exitCode=0
Sep 30 18:59:08 crc kubenswrapper[4747]: I0930 18:59:08.439336 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" event={"ID":"5a83c062-1bc6-4c6a-83cf-40064d73606f","Type":"ContainerDied","Data":"80a8d09fa80957f37985592a7002a1f8f051284964893df5e1437834c00ae657"}
Sep 30 18:59:08 crc kubenswrapper[4747]: I0930 18:59:08.445419 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerID="8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91" exitCode=0
Sep 30 18:59:08 crc kubenswrapper[4747]: I0930 18:59:08.445493 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6kbv" event={"ID":"1f2adf77-5bdd-459b-8829-f61c0cc70793","Type":"ContainerDied","Data":"8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91"}
Sep 30 18:59:08 crc kubenswrapper[4747]: I0930 18:59:08.445540 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6kbv" event={"ID":"1f2adf77-5bdd-459b-8829-f61c0cc70793","Type":"ContainerStarted","Data":"8f2c0a5124e49e183f83cadb89e007206f63abde90fc9d247d4e28c49680115e"}
Sep 30 18:59:09 crc kubenswrapper[4747]: I0930 18:59:09.457349 4747 generic.go:334] "Generic (PLEG): container finished" podID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerID="c53b13e261aa459e0840d520818b33441ba6461a87fdd32d7de4a12601f60e07" exitCode=0
Sep 30 18:59:09 crc kubenswrapper[4747]: I0930 18:59:09.457408 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" event={"ID":"5a83c062-1bc6-4c6a-83cf-40064d73606f","Type":"ContainerDied","Data":"c53b13e261aa459e0840d520818b33441ba6461a87fdd32d7de4a12601f60e07"}
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.470246 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerID="8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836" exitCode=0
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.470428 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6kbv" event={"ID":"1f2adf77-5bdd-459b-8829-f61c0cc70793","Type":"ContainerDied","Data":"8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836"}
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.837169 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs"
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.948191 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-bundle\") pod \"5a83c062-1bc6-4c6a-83cf-40064d73606f\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") "
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.948252 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvn8g\" (UniqueName: \"kubernetes.io/projected/5a83c062-1bc6-4c6a-83cf-40064d73606f-kube-api-access-zvn8g\") pod \"5a83c062-1bc6-4c6a-83cf-40064d73606f\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") "
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.948304 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-util\") pod \"5a83c062-1bc6-4c6a-83cf-40064d73606f\" (UID: \"5a83c062-1bc6-4c6a-83cf-40064d73606f\") "
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.949654 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-bundle" (OuterVolumeSpecName: "bundle") pod "5a83c062-1bc6-4c6a-83cf-40064d73606f" (UID: "5a83c062-1bc6-4c6a-83cf-40064d73606f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.956900 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a83c062-1bc6-4c6a-83cf-40064d73606f-kube-api-access-zvn8g" (OuterVolumeSpecName: "kube-api-access-zvn8g") pod "5a83c062-1bc6-4c6a-83cf-40064d73606f" (UID: "5a83c062-1bc6-4c6a-83cf-40064d73606f"). InnerVolumeSpecName "kube-api-access-zvn8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:59:10 crc kubenswrapper[4747]: I0930 18:59:10.961275 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-util" (OuterVolumeSpecName: "util") pod "5a83c062-1bc6-4c6a-83cf-40064d73606f" (UID: "5a83c062-1bc6-4c6a-83cf-40064d73606f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:59:11 crc kubenswrapper[4747]: I0930 18:59:11.049573 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvn8g\" (UniqueName: \"kubernetes.io/projected/5a83c062-1bc6-4c6a-83cf-40064d73606f-kube-api-access-zvn8g\") on node \"crc\" DevicePath \"\""
Sep 30 18:59:11 crc kubenswrapper[4747]: I0930 18:59:11.049596 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-util\") on node \"crc\" DevicePath \"\""
Sep 30 18:59:11 crc kubenswrapper[4747]: I0930 18:59:11.049605 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a83c062-1bc6-4c6a-83cf-40064d73606f-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 18:59:11 crc kubenswrapper[4747]: I0930 18:59:11.481554 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs" event={"ID":"5a83c062-1bc6-4c6a-83cf-40064d73606f","Type":"ContainerDied","Data":"449e79db515d7bcbf31abd11d994d50c1585927485a6158ccd0676a0090f13dd"}
Sep 30 18:59:11 crc kubenswrapper[4747]: I0930 18:59:11.482258 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="449e79db515d7bcbf31abd11d994d50c1585927485a6158ccd0676a0090f13dd"
Sep 30 18:59:11 crc kubenswrapper[4747]: I0930 18:59:11.481797 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs"
Sep 30 18:59:11 crc kubenswrapper[4747]: I0930 18:59:11.486456 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6kbv" event={"ID":"1f2adf77-5bdd-459b-8829-f61c0cc70793","Type":"ContainerStarted","Data":"8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a"}
Sep 30 18:59:11 crc kubenswrapper[4747]: I0930 18:59:11.890631 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b6kbv" podStartSLOduration=3.391575312 podStartE2EDuration="5.89060167s" podCreationTimestamp="2025-09-30 18:59:06 +0000 UTC" firstStartedPulling="2025-09-30 18:59:08.447837839 +0000 UTC m=+788.107317983" lastFinishedPulling="2025-09-30 18:59:10.946864227 +0000 UTC m=+790.606344341" observedRunningTime="2025-09-30 18:59:11.519173433 +0000 UTC m=+791.178653577" watchObservedRunningTime="2025-09-30 18:59:11.89060167 +0000 UTC m=+791.550081824"
Sep 30 18:59:17 crc kubenswrapper[4747]: I0930 18:59:17.220675 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:17 crc kubenswrapper[4747]: I0930 18:59:17.221293 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:17 crc kubenswrapper[4747]: I0930 18:59:17.305255 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:17 crc kubenswrapper[4747]: I0930 18:59:17.598392 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.578177 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"]
Sep 30 18:59:20 crc kubenswrapper[4747]: E0930 18:59:20.579439 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerName="pull"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.579547 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerName="pull"
Sep 30 18:59:20 crc kubenswrapper[4747]: E0930 18:59:20.579627 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerName="extract"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.579711 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerName="extract"
Sep 30 18:59:20 crc kubenswrapper[4747]: E0930 18:59:20.579798 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerName="util"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.579867 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerName="util"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.580078 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a83c062-1bc6-4c6a-83cf-40064d73606f" containerName="extract"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.580662 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.583581 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.583684 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.583822 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-8qz9h"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.584042 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.584165 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.597693 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"]
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.723940 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndbs8\" (UniqueName: \"kubernetes.io/projected/cb80bc02-e6c8-453b-b63b-5a95783c2520-kube-api-access-ndbs8\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.724020 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb80bc02-e6c8-453b-b63b-5a95783c2520-apiservice-cert\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.724080 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb80bc02-e6c8-453b-b63b-5a95783c2520-webhook-cert\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.825690 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb80bc02-e6c8-453b-b63b-5a95783c2520-webhook-cert\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.825972 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndbs8\" (UniqueName: \"kubernetes.io/projected/cb80bc02-e6c8-453b-b63b-5a95783c2520-kube-api-access-ndbs8\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.826091 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb80bc02-e6c8-453b-b63b-5a95783c2520-apiservice-cert\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.831656 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb80bc02-e6c8-453b-b63b-5a95783c2520-webhook-cert\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.834418 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb80bc02-e6c8-453b-b63b-5a95783c2520-apiservice-cert\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.846779 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndbs8\" (UniqueName: \"kubernetes.io/projected/cb80bc02-e6c8-453b-b63b-5a95783c2520-kube-api-access-ndbs8\") pod \"metallb-operator-controller-manager-d99c7c5b9-tzfsr\" (UID: \"cb80bc02-e6c8-453b-b63b-5a95783c2520\") " pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.901176 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.972718 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"]
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.973477 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.982291 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.982362 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.982397 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xj7th"
Sep 30 18:59:20 crc kubenswrapper[4747]: I0930 18:59:20.994200 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"]
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.051552 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6kbv"]
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.051990 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b6kbv" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerName="registry-server" containerID="cri-o://8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a" gracePeriod=2
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.139379 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7zz\" (UniqueName: \"kubernetes.io/projected/e14405c7-e1c8-4713-b14b-58926d71206b-kube-api-access-7p7zz\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.139489 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e14405c7-e1c8-4713-b14b-58926d71206b-apiservice-cert\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.139543 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e14405c7-e1c8-4713-b14b-58926d71206b-webhook-cert\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.142793 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"]
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.240234 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e14405c7-e1c8-4713-b14b-58926d71206b-apiservice-cert\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.240287 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e14405c7-e1c8-4713-b14b-58926d71206b-webhook-cert\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.240338 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7zz\" (UniqueName: \"kubernetes.io/projected/e14405c7-e1c8-4713-b14b-58926d71206b-kube-api-access-7p7zz\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.244427 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e14405c7-e1c8-4713-b14b-58926d71206b-apiservice-cert\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.244473 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e14405c7-e1c8-4713-b14b-58926d71206b-webhook-cert\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.257607 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p7zz\" (UniqueName: \"kubernetes.io/projected/e14405c7-e1c8-4713-b14b-58926d71206b-kube-api-access-7p7zz\") pod \"metallb-operator-webhook-server-5d95fdcc77-tsv28\" (UID: \"e14405c7-e1c8-4713-b14b-58926d71206b\") " pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.291276 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.545907 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr" event={"ID":"cb80bc02-e6c8-453b-b63b-5a95783c2520","Type":"ContainerStarted","Data":"7bf9e9608852120380fb759006884abe9856abd686839409f1c97838aebb8b7c"}
Sep 30 18:59:21 crc kubenswrapper[4747]: I0930 18:59:21.722679 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"]
Sep 30 18:59:21 crc kubenswrapper[4747]: W0930 18:59:21.734053 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode14405c7_e1c8_4713_b14b_58926d71206b.slice/crio-049e9f776946319cef2820aba63a94a0b0151748893795f31516a62a82bf7d70 WatchSource:0}: Error finding container 049e9f776946319cef2820aba63a94a0b0151748893795f31516a62a82bf7d70: Status 404 returned error can't find the container with id 049e9f776946319cef2820aba63a94a0b0151748893795f31516a62a82bf7d70
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.417984 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.555115 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28" event={"ID":"e14405c7-e1c8-4713-b14b-58926d71206b","Type":"ContainerStarted","Data":"049e9f776946319cef2820aba63a94a0b0151748893795f31516a62a82bf7d70"}
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.555341 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdqjj\" (UniqueName: \"kubernetes.io/projected/1f2adf77-5bdd-459b-8829-f61c0cc70793-kube-api-access-kdqjj\") pod \"1f2adf77-5bdd-459b-8829-f61c0cc70793\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") "
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.555444 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-catalog-content\") pod \"1f2adf77-5bdd-459b-8829-f61c0cc70793\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") "
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.555537 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-utilities\") pod \"1f2adf77-5bdd-459b-8829-f61c0cc70793\" (UID: \"1f2adf77-5bdd-459b-8829-f61c0cc70793\") "
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.557275 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-utilities" (OuterVolumeSpecName: "utilities") pod "1f2adf77-5bdd-459b-8829-f61c0cc70793" (UID: "1f2adf77-5bdd-459b-8829-f61c0cc70793"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.560550 4747 generic.go:334] "Generic (PLEG): container finished" podID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerID="8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a" exitCode=0
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.560631 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6kbv" event={"ID":"1f2adf77-5bdd-459b-8829-f61c0cc70793","Type":"ContainerDied","Data":"8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a"}
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.560680 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6kbv" event={"ID":"1f2adf77-5bdd-459b-8829-f61c0cc70793","Type":"ContainerDied","Data":"8f2c0a5124e49e183f83cadb89e007206f63abde90fc9d247d4e28c49680115e"}
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.560719 4747 scope.go:117] "RemoveContainer" containerID="8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.561018 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6kbv"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.566429 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2adf77-5bdd-459b-8829-f61c0cc70793-kube-api-access-kdqjj" (OuterVolumeSpecName: "kube-api-access-kdqjj") pod "1f2adf77-5bdd-459b-8829-f61c0cc70793" (UID: "1f2adf77-5bdd-459b-8829-f61c0cc70793"). InnerVolumeSpecName "kube-api-access-kdqjj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.624017 4747 scope.go:117] "RemoveContainer" containerID="8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.647604 4747 scope.go:117] "RemoveContainer" containerID="8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.653511 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f2adf77-5bdd-459b-8829-f61c0cc70793" (UID: "1f2adf77-5bdd-459b-8829-f61c0cc70793"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.657534 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.658005 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdqjj\" (UniqueName: \"kubernetes.io/projected/1f2adf77-5bdd-459b-8829-f61c0cc70793-kube-api-access-kdqjj\") on node \"crc\" DevicePath \"\""
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.658025 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2adf77-5bdd-459b-8829-f61c0cc70793-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.672615 4747 scope.go:117] "RemoveContainer" containerID="8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a"
Sep 30 18:59:22 crc kubenswrapper[4747]: E0930 18:59:22.673070 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a\": container with ID starting with 8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a not found: ID does not exist" containerID="8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.673138 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a"} err="failed to get container status \"8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a\": rpc error: code = NotFound desc = could not find container \"8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a\": container with ID starting with 8ef04eabbaa1b8e2eb4c9069a5dd2cd611e2082d0b373f1d8a1e39c87c6e0f2a not found: ID does not exist"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.673180 4747 scope.go:117] "RemoveContainer" containerID="8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836"
Sep 30 18:59:22 crc kubenswrapper[4747]: E0930 18:59:22.673546 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836\": container with ID starting with 8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836 not found: ID does not exist" containerID="8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.673595 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836"} err="failed to get container status \"8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836\": rpc error: code = NotFound desc = could not find container \"8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836\": container with ID starting with 8abbf7eeb752018d1869357d655260799d131ae9c396afda931c527c9995d836 not found: ID does not exist"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.673621 4747 scope.go:117] "RemoveContainer" containerID="8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91"
Sep 30 18:59:22 crc kubenswrapper[4747]: E0930 18:59:22.673883 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91\": container with ID starting with 8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91 not found: ID does not exist" containerID="8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.673954 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91"} err="failed to get container status \"8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91\": rpc error: code = NotFound desc = could not find container \"8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91\": container with ID starting with 8bf4380364a4231582ccdd85d1c0b84c2c7c813ad6be40d555258f7db1389b91 not found: ID does not exist"
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.892805 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6kbv"]
Sep 30 18:59:22 crc kubenswrapper[4747]: I0930 18:59:22.892862 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b6kbv"]
Sep 30 18:59:23 crc kubenswrapper[4747]: I0930 18:59:23.098431 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" path="/var/lib/kubelet/pods/1f2adf77-5bdd-459b-8829-f61c0cc70793/volumes"
Sep 30 18:59:24 crc kubenswrapper[4747]: I0930 18:59:24.581704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr" event={"ID":"cb80bc02-e6c8-453b-b63b-5a95783c2520","Type":"ContainerStarted","Data":"0bc603be41335cc07b089fa9e8eaa87430570c7705b1f60741a6366df5fb80ef"}
Sep 30 18:59:24 crc kubenswrapper[4747]: I0930 18:59:24.582190 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr"
Sep 30 18:59:24 crc kubenswrapper[4747]: I0930 18:59:24.620425 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr" podStartSLOduration=1.437941184 podStartE2EDuration="4.620391656s" podCreationTimestamp="2025-09-30 18:59:20 +0000 UTC" firstStartedPulling="2025-09-30 18:59:21.1607687 +0000 UTC m=+800.820248814" lastFinishedPulling="2025-09-30 18:59:24.343219172 +0000 UTC m=+804.002699286" observedRunningTime="2025-09-30 18:59:24.60835108 +0000 UTC m=+804.267831234" watchObservedRunningTime="2025-09-30 18:59:24.620391656 +0000 UTC m=+804.279871820"
Sep 30 18:59:35 crc kubenswrapper[4747]: I0930 18:59:35.661342 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28" event={"ID":"e14405c7-e1c8-4713-b14b-58926d71206b","Type":"ContainerStarted","Data":"9d67e84d5535301c1e3c7f0b54004de87a616c3780cd42a4a892fa87bcfbcc00"}
Sep 30 18:59:35 crc kubenswrapper[4747]: I0930 18:59:35.662232 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28"
Sep 30 18:59:35 crc kubenswrapper[4747]: I0930 18:59:35.692323 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28" podStartSLOduration=2.902482501 podStartE2EDuration="15.692305534s" podCreationTimestamp="2025-09-30 18:59:20 +0000 UTC" firstStartedPulling="2025-09-30 18:59:21.739204222 +0000 UTC m=+801.398684376" lastFinishedPulling="2025-09-30 18:59:34.529027285 +0000 UTC m=+814.188507409" observedRunningTime="2025-09-30 18:59:35.686766525 +0000 UTC m=+815.346246649" watchObservedRunningTime="2025-09-30 18:59:35.692305534 +0000 UTC m=+815.351785658" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.014417 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xcbp7"] Sep 30 18:59:41 crc kubenswrapper[4747]: E0930 18:59:41.015529 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerName="extract-content" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.015558 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerName="extract-content" Sep 30 18:59:41 crc kubenswrapper[4747]: E0930 18:59:41.015586 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerName="registry-server" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.015602 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerName="registry-server" Sep 30 18:59:41 crc kubenswrapper[4747]: E0930 18:59:41.015636 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerName="extract-utilities" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.015653 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerName="extract-utilities" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.015897 4747 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1f2adf77-5bdd-459b-8829-f61c0cc70793" containerName="registry-server" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.017750 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.031808 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xcbp7"] Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.053732 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-catalog-content\") pod \"certified-operators-xcbp7\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.053855 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh4h2\" (UniqueName: \"kubernetes.io/projected/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-kube-api-access-lh4h2\") pod \"certified-operators-xcbp7\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.053904 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-utilities\") pod \"certified-operators-xcbp7\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.155156 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh4h2\" (UniqueName: \"kubernetes.io/projected/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-kube-api-access-lh4h2\") pod 
\"certified-operators-xcbp7\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.155465 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-utilities\") pod \"certified-operators-xcbp7\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.155695 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-catalog-content\") pod \"certified-operators-xcbp7\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.156119 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-utilities\") pod \"certified-operators-xcbp7\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.156297 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-catalog-content\") pod \"certified-operators-xcbp7\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.180508 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh4h2\" (UniqueName: \"kubernetes.io/projected/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-kube-api-access-lh4h2\") pod \"certified-operators-xcbp7\" (UID: 
\"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.340445 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.656152 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xcbp7"] Sep 30 18:59:41 crc kubenswrapper[4747]: I0930 18:59:41.701652 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcbp7" event={"ID":"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1","Type":"ContainerStarted","Data":"574fd3ebf72f393d328a4626a2f7f8747ae76831a4e28335c23daea506941870"} Sep 30 18:59:42 crc kubenswrapper[4747]: I0930 18:59:42.712275 4747 generic.go:334] "Generic (PLEG): container finished" podID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerID="5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63" exitCode=0 Sep 30 18:59:42 crc kubenswrapper[4747]: I0930 18:59:42.712414 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcbp7" event={"ID":"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1","Type":"ContainerDied","Data":"5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63"} Sep 30 18:59:44 crc kubenswrapper[4747]: I0930 18:59:44.726744 4747 generic.go:334] "Generic (PLEG): container finished" podID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerID="c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090" exitCode=0 Sep 30 18:59:44 crc kubenswrapper[4747]: I0930 18:59:44.726852 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcbp7" event={"ID":"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1","Type":"ContainerDied","Data":"c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090"} Sep 30 18:59:45 crc kubenswrapper[4747]: I0930 
18:59:45.740022 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcbp7" event={"ID":"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1","Type":"ContainerStarted","Data":"d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4"} Sep 30 18:59:45 crc kubenswrapper[4747]: I0930 18:59:45.771539 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xcbp7" podStartSLOduration=3.274301444 podStartE2EDuration="5.771522261s" podCreationTimestamp="2025-09-30 18:59:40 +0000 UTC" firstStartedPulling="2025-09-30 18:59:42.715080465 +0000 UTC m=+822.374560619" lastFinishedPulling="2025-09-30 18:59:45.212301302 +0000 UTC m=+824.871781436" observedRunningTime="2025-09-30 18:59:45.767784173 +0000 UTC m=+825.427264327" watchObservedRunningTime="2025-09-30 18:59:45.771522261 +0000 UTC m=+825.431002385" Sep 30 18:59:51 crc kubenswrapper[4747]: I0930 18:59:51.297435 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5d95fdcc77-tsv28" Sep 30 18:59:51 crc kubenswrapper[4747]: I0930 18:59:51.340631 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:51 crc kubenswrapper[4747]: I0930 18:59:51.340712 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:51 crc kubenswrapper[4747]: I0930 18:59:51.423717 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:51 crc kubenswrapper[4747]: I0930 18:59:51.859490 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:54 crc kubenswrapper[4747]: I0930 18:59:54.394187 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-xcbp7"] Sep 30 18:59:54 crc kubenswrapper[4747]: I0930 18:59:54.394460 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xcbp7" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerName="registry-server" containerID="cri-o://d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4" gracePeriod=2 Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.498634 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.665995 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh4h2\" (UniqueName: \"kubernetes.io/projected/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-kube-api-access-lh4h2\") pod \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.666085 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-utilities\") pod \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.666138 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-catalog-content\") pod \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\" (UID: \"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1\") " Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.667580 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-utilities" (OuterVolumeSpecName: "utilities") pod "a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" (UID: 
"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.678711 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-kube-api-access-lh4h2" (OuterVolumeSpecName: "kube-api-access-lh4h2") pod "a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" (UID: "a5dc6c86-9e8d-461c-9c2e-44b21aa625f1"). InnerVolumeSpecName "kube-api-access-lh4h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.731116 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" (UID: "a5dc6c86-9e8d-461c-9c2e-44b21aa625f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.767496 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.767529 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh4h2\" (UniqueName: \"kubernetes.io/projected/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-kube-api-access-lh4h2\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.767543 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.809062 4747 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5qmfc"] Sep 30 18:59:55 crc kubenswrapper[4747]: E0930 18:59:55.809442 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerName="registry-server" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.809472 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerName="registry-server" Sep 30 18:59:55 crc kubenswrapper[4747]: E0930 18:59:55.809497 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerName="extract-content" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.809512 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerName="extract-content" Sep 30 18:59:55 crc kubenswrapper[4747]: E0930 18:59:55.809554 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerName="extract-utilities" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.809569 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerName="extract-utilities" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.809765 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerName="registry-server" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.811100 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.823479 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qmfc"] Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.829173 4747 generic.go:334] "Generic (PLEG): container finished" podID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" containerID="d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4" exitCode=0 Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.829403 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcbp7" event={"ID":"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1","Type":"ContainerDied","Data":"d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4"} Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.829583 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcbp7" event={"ID":"a5dc6c86-9e8d-461c-9c2e-44b21aa625f1","Type":"ContainerDied","Data":"574fd3ebf72f393d328a4626a2f7f8747ae76831a4e28335c23daea506941870"} Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.829719 4747 scope.go:117] "RemoveContainer" containerID="d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.830100 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xcbp7" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.859881 4747 scope.go:117] "RemoveContainer" containerID="c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.883205 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xcbp7"] Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.887607 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xcbp7"] Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.900267 4747 scope.go:117] "RemoveContainer" containerID="5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.925447 4747 scope.go:117] "RemoveContainer" containerID="d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4" Sep 30 18:59:55 crc kubenswrapper[4747]: E0930 18:59:55.926237 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4\": container with ID starting with d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4 not found: ID does not exist" containerID="d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.926297 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4"} err="failed to get container status \"d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4\": rpc error: code = NotFound desc = could not find container \"d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4\": container with ID starting with d1874d870d9e45f9ba3a2da8c6fb5d4b6424ff50970fd167d40c0a7f204624e4 not 
found: ID does not exist" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.926332 4747 scope.go:117] "RemoveContainer" containerID="c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090" Sep 30 18:59:55 crc kubenswrapper[4747]: E0930 18:59:55.926798 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090\": container with ID starting with c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090 not found: ID does not exist" containerID="c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.926903 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090"} err="failed to get container status \"c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090\": rpc error: code = NotFound desc = could not find container \"c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090\": container with ID starting with c825b4a4b7d53e0ccef8b93c32ff8da95feaba35b0ce0c787ec13ac9bb079090 not found: ID does not exist" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.927023 4747 scope.go:117] "RemoveContainer" containerID="5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63" Sep 30 18:59:55 crc kubenswrapper[4747]: E0930 18:59:55.927483 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63\": container with ID starting with 5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63 not found: ID does not exist" containerID="5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.927516 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63"} err="failed to get container status \"5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63\": rpc error: code = NotFound desc = could not find container \"5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63\": container with ID starting with 5cafa0680a47ff21a02442c7be4e00b2c304aa31a67525221ff09ba0e81a6d63 not found: ID does not exist" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.969507 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-utilities\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.969612 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-catalog-content\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:55 crc kubenswrapper[4747]: I0930 18:59:55.969653 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ft2d\" (UniqueName: \"kubernetes.io/projected/f103dfa1-cdb2-4574-9c0b-f31b113680ca-kube-api-access-4ft2d\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.072213 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-catalog-content\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.072304 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ft2d\" (UniqueName: \"kubernetes.io/projected/f103dfa1-cdb2-4574-9c0b-f31b113680ca-kube-api-access-4ft2d\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.072375 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-utilities\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.072837 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-catalog-content\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.072952 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-utilities\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.091571 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ft2d\" (UniqueName: 
\"kubernetes.io/projected/f103dfa1-cdb2-4574-9c0b-f31b113680ca-kube-api-access-4ft2d\") pod \"redhat-marketplace-5qmfc\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.148207 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.413123 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qmfc"] Sep 30 18:59:56 crc kubenswrapper[4747]: W0930 18:59:56.423216 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf103dfa1_cdb2_4574_9c0b_f31b113680ca.slice/crio-987dfff793429dfc299e419b6036d95d989333e619431f3ce6fd3349b6d1bd58 WatchSource:0}: Error finding container 987dfff793429dfc299e419b6036d95d989333e619431f3ce6fd3349b6d1bd58: Status 404 returned error can't find the container with id 987dfff793429dfc299e419b6036d95d989333e619431f3ce6fd3349b6d1bd58 Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.841071 4747 generic.go:334] "Generic (PLEG): container finished" podID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerID="bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72" exitCode=0 Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.841264 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qmfc" event={"ID":"f103dfa1-cdb2-4574-9c0b-f31b113680ca","Type":"ContainerDied","Data":"bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72"} Sep 30 18:59:56 crc kubenswrapper[4747]: I0930 18:59:56.841647 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qmfc" 
event={"ID":"f103dfa1-cdb2-4574-9c0b-f31b113680ca","Type":"ContainerStarted","Data":"987dfff793429dfc299e419b6036d95d989333e619431f3ce6fd3349b6d1bd58"} Sep 30 18:59:57 crc kubenswrapper[4747]: I0930 18:59:57.097817 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5dc6c86-9e8d-461c-9c2e-44b21aa625f1" path="/var/lib/kubelet/pods/a5dc6c86-9e8d-461c-9c2e-44b21aa625f1/volumes" Sep 30 18:59:57 crc kubenswrapper[4747]: I0930 18:59:57.857416 4747 generic.go:334] "Generic (PLEG): container finished" podID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerID="e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307" exitCode=0 Sep 30 18:59:57 crc kubenswrapper[4747]: I0930 18:59:57.857483 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qmfc" event={"ID":"f103dfa1-cdb2-4574-9c0b-f31b113680ca","Type":"ContainerDied","Data":"e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307"} Sep 30 18:59:58 crc kubenswrapper[4747]: I0930 18:59:58.883897 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qmfc" event={"ID":"f103dfa1-cdb2-4574-9c0b-f31b113680ca","Type":"ContainerStarted","Data":"a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a"} Sep 30 18:59:58 crc kubenswrapper[4747]: I0930 18:59:58.908783 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5qmfc" podStartSLOduration=2.446834319 podStartE2EDuration="3.908762821s" podCreationTimestamp="2025-09-30 18:59:55 +0000 UTC" firstStartedPulling="2025-09-30 18:59:56.843788989 +0000 UTC m=+836.503269133" lastFinishedPulling="2025-09-30 18:59:58.305717521 +0000 UTC m=+837.965197635" observedRunningTime="2025-09-30 18:59:58.905532808 +0000 UTC m=+838.565012932" watchObservedRunningTime="2025-09-30 18:59:58.908762821 +0000 UTC m=+838.568242955" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.151493 4747 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d"] Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.152628 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.155156 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.155620 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.160685 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d"] Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.336913 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rtdz\" (UniqueName: \"kubernetes.io/projected/6ea21876-713a-47d1-a172-11cf3b7fcb34-kube-api-access-7rtdz\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.337178 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ea21876-713a-47d1-a172-11cf3b7fcb34-config-volume\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.337245 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ea21876-713a-47d1-a172-11cf3b7fcb34-secret-volume\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.438640 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rtdz\" (UniqueName: \"kubernetes.io/projected/6ea21876-713a-47d1-a172-11cf3b7fcb34-kube-api-access-7rtdz\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.438756 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ea21876-713a-47d1-a172-11cf3b7fcb34-config-volume\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.438796 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ea21876-713a-47d1-a172-11cf3b7fcb34-secret-volume\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.440611 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ea21876-713a-47d1-a172-11cf3b7fcb34-config-volume\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: 
I0930 19:00:00.456565 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ea21876-713a-47d1-a172-11cf3b7fcb34-secret-volume\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.469550 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rtdz\" (UniqueName: \"kubernetes.io/projected/6ea21876-713a-47d1-a172-11cf3b7fcb34-kube-api-access-7rtdz\") pod \"collect-profiles-29320980-llg9d\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.495016 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.751893 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d"] Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.905606 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-d99c7c5b9-tzfsr" Sep 30 19:00:00 crc kubenswrapper[4747]: I0930 19:00:00.919251 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" event={"ID":"6ea21876-713a-47d1-a172-11cf3b7fcb34","Type":"ContainerStarted","Data":"ef387518a8b57e07b4b7152e372d09ebf83ce6bedc00dcd7a6737fd88ae06e10"} Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.602644 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lk65v"] Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.610078 4747 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8"] Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.610673 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.611146 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.617705 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8"] Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.621341 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.621632 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.621784 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-z6jsn" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.621912 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.700607 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-hgt8c"] Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.701456 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.705222 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.705260 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.705295 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.705350 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zd4sn" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.733090 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-kxnbc"] Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.733892 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.735468 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.749119 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-kxnbc"] Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.754948 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f190785a-8ea7-4e42-b803-192a77b4c874-frr-startup\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.754988 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9mpb\" (UniqueName: \"kubernetes.io/projected/f190785a-8ea7-4e42-b803-192a77b4c874-kube-api-access-z9mpb\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.755009 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-metrics\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.755040 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-reloader\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.755056 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-frr-sockets\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.755075 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-frr-conf\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.755107 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5dmn\" (UniqueName: \"kubernetes.io/projected/6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa-kube-api-access-l5dmn\") pod \"frr-k8s-webhook-server-5478bdb765-swnb8\" (UID: \"6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.755128 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa-cert\") pod \"frr-k8s-webhook-server-5478bdb765-swnb8\" (UID: \"6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.755143 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f190785a-8ea7-4e42-b803-192a77b4c874-metrics-certs\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856193 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5dmn\" (UniqueName: \"kubernetes.io/projected/6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa-kube-api-access-l5dmn\") pod \"frr-k8s-webhook-server-5478bdb765-swnb8\" (UID: \"6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856256 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftdh\" (UniqueName: \"kubernetes.io/projected/78df1507-7948-4a08-a2be-1b1a60cbb9ff-kube-api-access-xftdh\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa-cert\") pod \"frr-k8s-webhook-server-5478bdb765-swnb8\" (UID: \"6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f190785a-8ea7-4e42-b803-192a77b4c874-metrics-certs\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856369 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-memberlist\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856389 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78df1507-7948-4a08-a2be-1b1a60cbb9ff-metrics-certs\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856416 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-metrics-certs\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856444 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f190785a-8ea7-4e42-b803-192a77b4c874-frr-startup\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856467 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wslw2\" (UniqueName: \"kubernetes.io/projected/f5763869-994d-4a75-a8a9-00bee3414aad-kube-api-access-wslw2\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9mpb\" (UniqueName: \"kubernetes.io/projected/f190785a-8ea7-4e42-b803-192a77b4c874-kube-api-access-z9mpb\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856556 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-metrics\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: E0930 19:00:01.856527 4747 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856582 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78df1507-7948-4a08-a2be-1b1a60cbb9ff-cert\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: E0930 19:00:01.856674 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f190785a-8ea7-4e42-b803-192a77b4c874-metrics-certs podName:f190785a-8ea7-4e42-b803-192a77b4c874 nodeName:}" failed. No retries permitted until 2025-09-30 19:00:02.356641403 +0000 UTC m=+842.016121587 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f190785a-8ea7-4e42-b803-192a77b4c874-metrics-certs") pod "frr-k8s-lk65v" (UID: "f190785a-8ea7-4e42-b803-192a77b4c874") : secret "frr-k8s-certs-secret" not found Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856694 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f5763869-994d-4a75-a8a9-00bee3414aad-metallb-excludel2\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856748 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-reloader\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.856811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-frr-sockets\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.857111 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-frr-conf\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.857008 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-metrics\") pod \"frr-k8s-lk65v\" (UID: 
\"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.857142 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-frr-sockets\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.857065 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-reloader\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.857370 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f190785a-8ea7-4e42-b803-192a77b4c874-frr-conf\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.857560 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f190785a-8ea7-4e42-b803-192a77b4c874-frr-startup\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.863955 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa-cert\") pod \"frr-k8s-webhook-server-5478bdb765-swnb8\" (UID: \"6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.873447 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l5dmn\" (UniqueName: \"kubernetes.io/projected/6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa-kube-api-access-l5dmn\") pod \"frr-k8s-webhook-server-5478bdb765-swnb8\" (UID: \"6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.874512 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9mpb\" (UniqueName: \"kubernetes.io/projected/f190785a-8ea7-4e42-b803-192a77b4c874-kube-api-access-z9mpb\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.925172 4747 generic.go:334] "Generic (PLEG): container finished" podID="6ea21876-713a-47d1-a172-11cf3b7fcb34" containerID="349ffddd3b25f0176a1a407eea84944d07329675e673d6c74bac95cbcdb094c8" exitCode=0 Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.925205 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" event={"ID":"6ea21876-713a-47d1-a172-11cf3b7fcb34","Type":"ContainerDied","Data":"349ffddd3b25f0176a1a407eea84944d07329675e673d6c74bac95cbcdb094c8"} Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.935982 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.958053 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftdh\" (UniqueName: \"kubernetes.io/projected/78df1507-7948-4a08-a2be-1b1a60cbb9ff-kube-api-access-xftdh\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.958389 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-memberlist\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.958414 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78df1507-7948-4a08-a2be-1b1a60cbb9ff-metrics-certs\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.958436 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-metrics-certs\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.958465 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wslw2\" (UniqueName: \"kubernetes.io/projected/f5763869-994d-4a75-a8a9-00bee3414aad-kube-api-access-wslw2\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc 
kubenswrapper[4747]: I0930 19:00:01.958502 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78df1507-7948-4a08-a2be-1b1a60cbb9ff-cert\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.958529 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f5763869-994d-4a75-a8a9-00bee3414aad-metallb-excludel2\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: E0930 19:00:01.958563 4747 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Sep 30 19:00:01 crc kubenswrapper[4747]: E0930 19:00:01.958645 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-metrics-certs podName:f5763869-994d-4a75-a8a9-00bee3414aad nodeName:}" failed. No retries permitted until 2025-09-30 19:00:02.458620356 +0000 UTC m=+842.118100540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-metrics-certs") pod "speaker-hgt8c" (UID: "f5763869-994d-4a75-a8a9-00bee3414aad") : secret "speaker-certs-secret" not found Sep 30 19:00:01 crc kubenswrapper[4747]: E0930 19:00:01.958757 4747 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 19:00:01 crc kubenswrapper[4747]: E0930 19:00:01.958796 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-memberlist podName:f5763869-994d-4a75-a8a9-00bee3414aad nodeName:}" failed. 
No retries permitted until 2025-09-30 19:00:02.458784051 +0000 UTC m=+842.118264245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-memberlist") pod "speaker-hgt8c" (UID: "f5763869-994d-4a75-a8a9-00bee3414aad") : secret "metallb-memberlist" not found Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.959341 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f5763869-994d-4a75-a8a9-00bee3414aad-metallb-excludel2\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.960918 4747 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.963381 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78df1507-7948-4a08-a2be-1b1a60cbb9ff-metrics-certs\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.972564 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78df1507-7948-4a08-a2be-1b1a60cbb9ff-cert\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.973689 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftdh\" (UniqueName: \"kubernetes.io/projected/78df1507-7948-4a08-a2be-1b1a60cbb9ff-kube-api-access-xftdh\") pod \"controller-5d688f5ffc-kxnbc\" (UID: \"78df1507-7948-4a08-a2be-1b1a60cbb9ff\") " 
pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:01 crc kubenswrapper[4747]: I0930 19:00:01.981601 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wslw2\" (UniqueName: \"kubernetes.io/projected/f5763869-994d-4a75-a8a9-00bee3414aad-kube-api-access-wslw2\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.047259 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.129599 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8"] Sep 30 19:00:02 crc kubenswrapper[4747]: W0930 19:00:02.140155 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6af1f61d_d5a1_4a3a_94ef_08d731e3b1aa.slice/crio-f02073a11c6ab999d36f61e5229adb74e422da0eed178adee9724f86b78856a4 WatchSource:0}: Error finding container f02073a11c6ab999d36f61e5229adb74e422da0eed178adee9724f86b78856a4: Status 404 returned error can't find the container with id f02073a11c6ab999d36f61e5229adb74e422da0eed178adee9724f86b78856a4 Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.229283 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-kxnbc"] Sep 30 19:00:02 crc kubenswrapper[4747]: W0930 19:00:02.237154 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78df1507_7948_4a08_a2be_1b1a60cbb9ff.slice/crio-ac18909861e7f36666ac93e4a2db3496b170d322c39714c6a1a6e88afd6a721e WatchSource:0}: Error finding container ac18909861e7f36666ac93e4a2db3496b170d322c39714c6a1a6e88afd6a721e: Status 404 returned error can't find the container with id 
ac18909861e7f36666ac93e4a2db3496b170d322c39714c6a1a6e88afd6a721e Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.363451 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f190785a-8ea7-4e42-b803-192a77b4c874-metrics-certs\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.371283 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f190785a-8ea7-4e42-b803-192a77b4c874-metrics-certs\") pod \"frr-k8s-lk65v\" (UID: \"f190785a-8ea7-4e42-b803-192a77b4c874\") " pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.465588 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-memberlist\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.465646 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-metrics-certs\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.474464 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-metrics-certs\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.474599 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" 
(UniqueName: \"kubernetes.io/secret/f5763869-994d-4a75-a8a9-00bee3414aad-memberlist\") pod \"speaker-hgt8c\" (UID: \"f5763869-994d-4a75-a8a9-00bee3414aad\") " pod="metallb-system/speaker-hgt8c" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.551146 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.620685 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-hgt8c" Sep 30 19:00:02 crc kubenswrapper[4747]: W0930 19:00:02.640157 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5763869_994d_4a75_a8a9_00bee3414aad.slice/crio-2a489db7244c55034a7e16bd2e6d461bef7a94ffaddc0ae41368c0e4cde98d79 WatchSource:0}: Error finding container 2a489db7244c55034a7e16bd2e6d461bef7a94ffaddc0ae41368c0e4cde98d79: Status 404 returned error can't find the container with id 2a489db7244c55034a7e16bd2e6d461bef7a94ffaddc0ae41368c0e4cde98d79 Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.931622 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" event={"ID":"6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa","Type":"ContainerStarted","Data":"f02073a11c6ab999d36f61e5229adb74e422da0eed178adee9724f86b78856a4"} Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.933319 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hgt8c" event={"ID":"f5763869-994d-4a75-a8a9-00bee3414aad","Type":"ContainerStarted","Data":"68284420bc5bedbf0bb60142216586decec366bf148149a78a0b2721f75dfb2f"} Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.933346 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hgt8c" 
event={"ID":"f5763869-994d-4a75-a8a9-00bee3414aad","Type":"ContainerStarted","Data":"2a489db7244c55034a7e16bd2e6d461bef7a94ffaddc0ae41368c0e4cde98d79"} Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.934464 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerStarted","Data":"12bc0155c11461a079f204d438db1cf2ea18b8accad54e8214d5ce3fe1c60228"} Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.936195 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-kxnbc" event={"ID":"78df1507-7948-4a08-a2be-1b1a60cbb9ff","Type":"ContainerStarted","Data":"d431b0d6167540f6ce08a1a84ea9ece40aa6e19b43a42b3aba39669f7f7f18f0"} Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.936314 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-kxnbc" event={"ID":"78df1507-7948-4a08-a2be-1b1a60cbb9ff","Type":"ContainerStarted","Data":"e56dbdeec2b11a34fe180f9584e75cbc0eb5ce3770c511906a4883e4074933f6"} Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.936416 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-kxnbc" event={"ID":"78df1507-7948-4a08-a2be-1b1a60cbb9ff","Type":"ContainerStarted","Data":"ac18909861e7f36666ac93e4a2db3496b170d322c39714c6a1a6e88afd6a721e"} Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.936506 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:02 crc kubenswrapper[4747]: I0930 19:00:02.960668 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-kxnbc" podStartSLOduration=1.960638084 podStartE2EDuration="1.960638084s" podCreationTimestamp="2025-09-30 19:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-30 19:00:02.955666143 +0000 UTC m=+842.615146257" watchObservedRunningTime="2025-09-30 19:00:02.960638084 +0000 UTC m=+842.620118218" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.226730 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.292801 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ea21876-713a-47d1-a172-11cf3b7fcb34-secret-volume\") pod \"6ea21876-713a-47d1-a172-11cf3b7fcb34\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.292896 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rtdz\" (UniqueName: \"kubernetes.io/projected/6ea21876-713a-47d1-a172-11cf3b7fcb34-kube-api-access-7rtdz\") pod \"6ea21876-713a-47d1-a172-11cf3b7fcb34\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.292969 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ea21876-713a-47d1-a172-11cf3b7fcb34-config-volume\") pod \"6ea21876-713a-47d1-a172-11cf3b7fcb34\" (UID: \"6ea21876-713a-47d1-a172-11cf3b7fcb34\") " Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.294214 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea21876-713a-47d1-a172-11cf3b7fcb34-config-volume" (OuterVolumeSpecName: "config-volume") pod "6ea21876-713a-47d1-a172-11cf3b7fcb34" (UID: "6ea21876-713a-47d1-a172-11cf3b7fcb34"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.298616 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea21876-713a-47d1-a172-11cf3b7fcb34-kube-api-access-7rtdz" (OuterVolumeSpecName: "kube-api-access-7rtdz") pod "6ea21876-713a-47d1-a172-11cf3b7fcb34" (UID: "6ea21876-713a-47d1-a172-11cf3b7fcb34"). InnerVolumeSpecName "kube-api-access-7rtdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.300468 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea21876-713a-47d1-a172-11cf3b7fcb34-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6ea21876-713a-47d1-a172-11cf3b7fcb34" (UID: "6ea21876-713a-47d1-a172-11cf3b7fcb34"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.394688 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ea21876-713a-47d1-a172-11cf3b7fcb34-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.394727 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rtdz\" (UniqueName: \"kubernetes.io/projected/6ea21876-713a-47d1-a172-11cf3b7fcb34-kube-api-access-7rtdz\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.394739 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ea21876-713a-47d1-a172-11cf3b7fcb34-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.945606 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hgt8c" 
event={"ID":"f5763869-994d-4a75-a8a9-00bee3414aad","Type":"ContainerStarted","Data":"c3e4d1c4301e1ad99f4a1f9ca5d2037a6a4680d72fbdba36adc981e7087d376d"} Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.945680 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-hgt8c" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.951591 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.951593 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320980-llg9d" event={"ID":"6ea21876-713a-47d1-a172-11cf3b7fcb34","Type":"ContainerDied","Data":"ef387518a8b57e07b4b7152e372d09ebf83ce6bedc00dcd7a6737fd88ae06e10"} Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.951644 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef387518a8b57e07b4b7152e372d09ebf83ce6bedc00dcd7a6737fd88ae06e10" Sep 30 19:00:03 crc kubenswrapper[4747]: I0930 19:00:03.963535 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-hgt8c" podStartSLOduration=2.963515718 podStartE2EDuration="2.963515718s" podCreationTimestamp="2025-09-30 19:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:00:03.959532815 +0000 UTC m=+843.619012939" watchObservedRunningTime="2025-09-30 19:00:03.963515718 +0000 UTC m=+843.622995832" Sep 30 19:00:06 crc kubenswrapper[4747]: I0930 19:00:06.148380 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 19:00:06 crc kubenswrapper[4747]: I0930 19:00:06.149066 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 19:00:06 crc kubenswrapper[4747]: I0930 19:00:06.193238 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 19:00:07 crc kubenswrapper[4747]: I0930 19:00:07.049175 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 19:00:08 crc kubenswrapper[4747]: I0930 19:00:08.595551 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qmfc"] Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.000176 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5qmfc" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerName="registry-server" containerID="cri-o://a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a" gracePeriod=2 Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.686800 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.717954 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ft2d\" (UniqueName: \"kubernetes.io/projected/f103dfa1-cdb2-4574-9c0b-f31b113680ca-kube-api-access-4ft2d\") pod \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.718510 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-catalog-content\") pod \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.718626 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-utilities\") pod \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\" (UID: \"f103dfa1-cdb2-4574-9c0b-f31b113680ca\") " Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.720360 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-utilities" (OuterVolumeSpecName: "utilities") pod "f103dfa1-cdb2-4574-9c0b-f31b113680ca" (UID: "f103dfa1-cdb2-4574-9c0b-f31b113680ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.738771 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f103dfa1-cdb2-4574-9c0b-f31b113680ca" (UID: "f103dfa1-cdb2-4574-9c0b-f31b113680ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.751920 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f103dfa1-cdb2-4574-9c0b-f31b113680ca-kube-api-access-4ft2d" (OuterVolumeSpecName: "kube-api-access-4ft2d") pod "f103dfa1-cdb2-4574-9c0b-f31b113680ca" (UID: "f103dfa1-cdb2-4574-9c0b-f31b113680ca"). InnerVolumeSpecName "kube-api-access-4ft2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.820731 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.820792 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f103dfa1-cdb2-4574-9c0b-f31b113680ca-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:09 crc kubenswrapper[4747]: I0930 19:00:09.820805 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ft2d\" (UniqueName: \"kubernetes.io/projected/f103dfa1-cdb2-4574-9c0b-f31b113680ca-kube-api-access-4ft2d\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.011577 4747 generic.go:334] "Generic (PLEG): container finished" podID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerID="a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a" exitCode=0 Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.011666 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qmfc" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.011666 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qmfc" event={"ID":"f103dfa1-cdb2-4574-9c0b-f31b113680ca","Type":"ContainerDied","Data":"a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a"} Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.011770 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qmfc" event={"ID":"f103dfa1-cdb2-4574-9c0b-f31b113680ca","Type":"ContainerDied","Data":"987dfff793429dfc299e419b6036d95d989333e619431f3ce6fd3349b6d1bd58"} Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.011827 4747 scope.go:117] "RemoveContainer" containerID="a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.015085 4747 generic.go:334] "Generic (PLEG): container finished" podID="f190785a-8ea7-4e42-b803-192a77b4c874" containerID="bd16d3b8b6dde87c1b8a17a330a6a61844e4bfe39a7abd568719266f983e3fe5" exitCode=0 Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.015195 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerDied","Data":"bd16d3b8b6dde87c1b8a17a330a6a61844e4bfe39a7abd568719266f983e3fe5"} Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.017768 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" event={"ID":"6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa","Type":"ContainerStarted","Data":"3d81a1bf4456775cfd59d826f446c30e48f5fd692f463ffe7debf4be8fba1ca7"} Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.017947 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:10 crc 
kubenswrapper[4747]: I0930 19:00:10.038650 4747 scope.go:117] "RemoveContainer" containerID="e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.054148 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" podStartSLOduration=1.67598965 podStartE2EDuration="9.054120196s" podCreationTimestamp="2025-09-30 19:00:01 +0000 UTC" firstStartedPulling="2025-09-30 19:00:02.142422119 +0000 UTC m=+841.801902233" lastFinishedPulling="2025-09-30 19:00:09.520552675 +0000 UTC m=+849.180032779" observedRunningTime="2025-09-30 19:00:10.047206429 +0000 UTC m=+849.706686593" watchObservedRunningTime="2025-09-30 19:00:10.054120196 +0000 UTC m=+849.713600350" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.092165 4747 scope.go:117] "RemoveContainer" containerID="bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.106490 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qmfc"] Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.111300 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qmfc"] Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.116480 4747 scope.go:117] "RemoveContainer" containerID="a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a" Sep 30 19:00:10 crc kubenswrapper[4747]: E0930 19:00:10.117009 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a\": container with ID starting with a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a not found: ID does not exist" containerID="a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a" Sep 30 19:00:10 crc 
kubenswrapper[4747]: I0930 19:00:10.117061 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a"} err="failed to get container status \"a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a\": rpc error: code = NotFound desc = could not find container \"a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a\": container with ID starting with a8468a3348c42e0fab052020a815fcbc016b4c830cb9890e625f70a97ca9612a not found: ID does not exist" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.117093 4747 scope.go:117] "RemoveContainer" containerID="e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307" Sep 30 19:00:10 crc kubenswrapper[4747]: E0930 19:00:10.117438 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307\": container with ID starting with e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307 not found: ID does not exist" containerID="e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.117490 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307"} err="failed to get container status \"e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307\": rpc error: code = NotFound desc = could not find container \"e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307\": container with ID starting with e6f843706b475d91a429b3699da8535e5d6bc727b38ed7fc15ff68939606d307 not found: ID does not exist" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.117627 4747 scope.go:117] "RemoveContainer" containerID="bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72" Sep 30 
19:00:10 crc kubenswrapper[4747]: E0930 19:00:10.118757 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72\": container with ID starting with bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72 not found: ID does not exist" containerID="bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72" Sep 30 19:00:10 crc kubenswrapper[4747]: I0930 19:00:10.118789 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72"} err="failed to get container status \"bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72\": rpc error: code = NotFound desc = could not find container \"bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72\": container with ID starting with bc1c26d070eba3b067f1cbf7d42caa41eafd25e43f656dbc63c2012b8cccba72 not found: ID does not exist" Sep 30 19:00:11 crc kubenswrapper[4747]: I0930 19:00:11.047197 4747 generic.go:334] "Generic (PLEG): container finished" podID="f190785a-8ea7-4e42-b803-192a77b4c874" containerID="ceb97f87af137049c952e6055c061dea6cd699f8fdaf94af5599322d4c36753d" exitCode=0 Sep 30 19:00:11 crc kubenswrapper[4747]: I0930 19:00:11.047313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerDied","Data":"ceb97f87af137049c952e6055c061dea6cd699f8fdaf94af5599322d4c36753d"} Sep 30 19:00:11 crc kubenswrapper[4747]: I0930 19:00:11.119769 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" path="/var/lib/kubelet/pods/f103dfa1-cdb2-4574-9c0b-f31b113680ca/volumes" Sep 30 19:00:12 crc kubenswrapper[4747]: I0930 19:00:12.055638 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-5d688f5ffc-kxnbc" Sep 30 19:00:12 crc kubenswrapper[4747]: I0930 19:00:12.062913 4747 generic.go:334] "Generic (PLEG): container finished" podID="f190785a-8ea7-4e42-b803-192a77b4c874" containerID="2a16cf5de1630c97dc5f7e0d95811e7528aa7caf1d6109eb114515a50b781602" exitCode=0 Sep 30 19:00:12 crc kubenswrapper[4747]: I0930 19:00:12.062988 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerDied","Data":"2a16cf5de1630c97dc5f7e0d95811e7528aa7caf1d6109eb114515a50b781602"} Sep 30 19:00:12 crc kubenswrapper[4747]: I0930 19:00:12.624026 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-hgt8c" Sep 30 19:00:13 crc kubenswrapper[4747]: I0930 19:00:13.079172 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerStarted","Data":"e9ab7af73d378be3ddf8b95ee03409404d20069a5b15ceed8228d17c0dacf9ba"} Sep 30 19:00:13 crc kubenswrapper[4747]: I0930 19:00:13.080695 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerStarted","Data":"68a44f0242553a30ec2c267a7704e1a3c8acd0f442cf5c7a7021c751a7287e5d"} Sep 30 19:00:13 crc kubenswrapper[4747]: I0930 19:00:13.080800 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerStarted","Data":"457a7de1f57c2ecdbff86d0f52c23670779237a080efa0e7783f10c79653722c"} Sep 30 19:00:13 crc kubenswrapper[4747]: I0930 19:00:13.080895 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerStarted","Data":"4962b39631215a10b8c403a94ada7c837fd9ac207e7ab79b9319729a0d81014c"} 
Sep 30 19:00:13 crc kubenswrapper[4747]: I0930 19:00:13.080997 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerStarted","Data":"4ea6c53912c3b4bc5d7759cbaf426e2cc964944703276f1e42d844f01d2b6a74"} Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.090476 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lk65v" event={"ID":"f190785a-8ea7-4e42-b803-192a77b4c874","Type":"ContainerStarted","Data":"afda7f3a8bec05ef453caa59d722bb5b7d1f73af4ce41c5af5bb12da3d902977"} Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.090789 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.130742 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lk65v" podStartSLOduration=6.278509101 podStartE2EDuration="13.130716622s" podCreationTimestamp="2025-09-30 19:00:01 +0000 UTC" firstStartedPulling="2025-09-30 19:00:02.706211842 +0000 UTC m=+842.365691946" lastFinishedPulling="2025-09-30 19:00:09.558419353 +0000 UTC m=+849.217899467" observedRunningTime="2025-09-30 19:00:14.124579348 +0000 UTC m=+853.784059502" watchObservedRunningTime="2025-09-30 19:00:14.130716622 +0000 UTC m=+853.790196776" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.459751 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk"] Sep 30 19:00:14 crc kubenswrapper[4747]: E0930 19:00:14.460133 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerName="extract-content" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.460164 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerName="extract-content" Sep 
30 19:00:14 crc kubenswrapper[4747]: E0930 19:00:14.460185 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerName="registry-server" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.460198 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerName="registry-server" Sep 30 19:00:14 crc kubenswrapper[4747]: E0930 19:00:14.460219 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea21876-713a-47d1-a172-11cf3b7fcb34" containerName="collect-profiles" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.460231 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea21876-713a-47d1-a172-11cf3b7fcb34" containerName="collect-profiles" Sep 30 19:00:14 crc kubenswrapper[4747]: E0930 19:00:14.460260 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerName="extract-utilities" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.460271 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerName="extract-utilities" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.460429 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f103dfa1-cdb2-4574-9c0b-f31b113680ca" containerName="registry-server" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.460447 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea21876-713a-47d1-a172-11cf3b7fcb34" containerName="collect-profiles" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.461521 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.468312 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.471568 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk"] Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.490207 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.490255 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.490274 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hsl2\" (UniqueName: \"kubernetes.io/projected/238ba769-d3c7-4101-931c-f752b3343092-kube-api-access-4hsl2\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: 
I0930 19:00:14.591474 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.591855 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.592131 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hsl2\" (UniqueName: \"kubernetes.io/projected/238ba769-d3c7-4101-931c-f752b3343092-kube-api-access-4hsl2\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.592406 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.592432 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.617137 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hsl2\" (UniqueName: \"kubernetes.io/projected/238ba769-d3c7-4101-931c-f752b3343092-kube-api-access-4hsl2\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:14 crc kubenswrapper[4747]: I0930 19:00:14.798326 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:15 crc kubenswrapper[4747]: I0930 19:00:15.239340 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk"] Sep 30 19:00:15 crc kubenswrapper[4747]: W0930 19:00:15.267097 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod238ba769_d3c7_4101_931c_f752b3343092.slice/crio-b694643b6cb26698dda0ec2dacb36328b89e5001334a0729c695b312c0e71a0a WatchSource:0}: Error finding container b694643b6cb26698dda0ec2dacb36328b89e5001334a0729c695b312c0e71a0a: Status 404 returned error can't find the container with id b694643b6cb26698dda0ec2dacb36328b89e5001334a0729c695b312c0e71a0a Sep 30 19:00:16 crc kubenswrapper[4747]: I0930 19:00:16.109179 4747 generic.go:334] "Generic (PLEG): container finished" podID="238ba769-d3c7-4101-931c-f752b3343092" containerID="d3229e9c529e56a2df741c5308486d39a2a79f8bfb831e1b7ef3c6c404d11253" exitCode=0 
Sep 30 19:00:16 crc kubenswrapper[4747]: I0930 19:00:16.109267 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" event={"ID":"238ba769-d3c7-4101-931c-f752b3343092","Type":"ContainerDied","Data":"d3229e9c529e56a2df741c5308486d39a2a79f8bfb831e1b7ef3c6c404d11253"} Sep 30 19:00:16 crc kubenswrapper[4747]: I0930 19:00:16.109636 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" event={"ID":"238ba769-d3c7-4101-931c-f752b3343092","Type":"ContainerStarted","Data":"b694643b6cb26698dda0ec2dacb36328b89e5001334a0729c695b312c0e71a0a"} Sep 30 19:00:17 crc kubenswrapper[4747]: I0930 19:00:17.552980 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:17 crc kubenswrapper[4747]: I0930 19:00:17.589263 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:20 crc kubenswrapper[4747]: I0930 19:00:20.144483 4747 generic.go:334] "Generic (PLEG): container finished" podID="238ba769-d3c7-4101-931c-f752b3343092" containerID="c40c40da50cd4f8e45e8af2e58ccb2af0d16125befa3b8fb4251653390a1b3fe" exitCode=0 Sep 30 19:00:20 crc kubenswrapper[4747]: I0930 19:00:20.144535 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" event={"ID":"238ba769-d3c7-4101-931c-f752b3343092","Type":"ContainerDied","Data":"c40c40da50cd4f8e45e8af2e58ccb2af0d16125befa3b8fb4251653390a1b3fe"} Sep 30 19:00:21 crc kubenswrapper[4747]: I0930 19:00:21.943970 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-swnb8" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.162775 4747 generic.go:334] "Generic (PLEG): container 
finished" podID="238ba769-d3c7-4101-931c-f752b3343092" containerID="f93ad828af25c12d3ccdfa2ce936b53c38afe18ce0ae5cd84a9fbaa090a24df5" exitCode=0 Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.162827 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" event={"ID":"238ba769-d3c7-4101-931c-f752b3343092","Type":"ContainerDied","Data":"f93ad828af25c12d3ccdfa2ce936b53c38afe18ce0ae5cd84a9fbaa090a24df5"} Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.411146 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zs2dg"] Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.413546 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.415413 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-utilities\") pod \"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.415585 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-catalog-content\") pod \"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.415665 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9w2j\" (UniqueName: \"kubernetes.io/projected/bb294eab-16c5-4037-ba26-6dae9c8d2558-kube-api-access-z9w2j\") pod 
\"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.452516 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zs2dg"] Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.517216 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-catalog-content\") pod \"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.517363 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9w2j\" (UniqueName: \"kubernetes.io/projected/bb294eab-16c5-4037-ba26-6dae9c8d2558-kube-api-access-z9w2j\") pod \"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.517458 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-utilities\") pod \"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.518362 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-utilities\") pod \"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.518901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-catalog-content\") pod \"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.539128 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9w2j\" (UniqueName: \"kubernetes.io/projected/bb294eab-16c5-4037-ba26-6dae9c8d2558-kube-api-access-z9w2j\") pod \"community-operators-zs2dg\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.559018 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lk65v" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.745229 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:22 crc kubenswrapper[4747]: I0930 19:00:22.986394 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zs2dg"] Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.168900 4747 generic.go:334] "Generic (PLEG): container finished" podID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerID="6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a" exitCode=0 Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.168980 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zs2dg" event={"ID":"bb294eab-16c5-4037-ba26-6dae9c8d2558","Type":"ContainerDied","Data":"6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a"} Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.169053 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zs2dg" 
event={"ID":"bb294eab-16c5-4037-ba26-6dae9c8d2558","Type":"ContainerStarted","Data":"f18de288ad98ba04b3488f2fc95b0d3a0457ca85437448b921703ad4330f3003"} Sep 30 19:00:23 crc kubenswrapper[4747]: E0930 19:00:23.227227 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb294eab_16c5_4037_ba26_6dae9c8d2558.slice/crio-6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb294eab_16c5_4037_ba26_6dae9c8d2558.slice/crio-conmon-6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a.scope\": RecentStats: unable to find data in memory cache]" Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.420437 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.528575 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hsl2\" (UniqueName: \"kubernetes.io/projected/238ba769-d3c7-4101-931c-f752b3343092-kube-api-access-4hsl2\") pod \"238ba769-d3c7-4101-931c-f752b3343092\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.528672 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-bundle\") pod \"238ba769-d3c7-4101-931c-f752b3343092\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.528708 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-util\") pod 
\"238ba769-d3c7-4101-931c-f752b3343092\" (UID: \"238ba769-d3c7-4101-931c-f752b3343092\") " Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.530023 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-bundle" (OuterVolumeSpecName: "bundle") pod "238ba769-d3c7-4101-931c-f752b3343092" (UID: "238ba769-d3c7-4101-931c-f752b3343092"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.537946 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/238ba769-d3c7-4101-931c-f752b3343092-kube-api-access-4hsl2" (OuterVolumeSpecName: "kube-api-access-4hsl2") pod "238ba769-d3c7-4101-931c-f752b3343092" (UID: "238ba769-d3c7-4101-931c-f752b3343092"). InnerVolumeSpecName "kube-api-access-4hsl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.542756 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-util" (OuterVolumeSpecName: "util") pod "238ba769-d3c7-4101-931c-f752b3343092" (UID: "238ba769-d3c7-4101-931c-f752b3343092"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.630563 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.631039 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/238ba769-d3c7-4101-931c-f752b3343092-util\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:23 crc kubenswrapper[4747]: I0930 19:00:23.631062 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hsl2\" (UniqueName: \"kubernetes.io/projected/238ba769-d3c7-4101-931c-f752b3343092-kube-api-access-4hsl2\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:24 crc kubenswrapper[4747]: I0930 19:00:24.181683 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" event={"ID":"238ba769-d3c7-4101-931c-f752b3343092","Type":"ContainerDied","Data":"b694643b6cb26698dda0ec2dacb36328b89e5001334a0729c695b312c0e71a0a"} Sep 30 19:00:24 crc kubenswrapper[4747]: I0930 19:00:24.181733 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b694643b6cb26698dda0ec2dacb36328b89e5001334a0729c695b312c0e71a0a" Sep 30 19:00:24 crc kubenswrapper[4747]: I0930 19:00:24.181802 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk" Sep 30 19:00:25 crc kubenswrapper[4747]: I0930 19:00:25.187620 4747 generic.go:334] "Generic (PLEG): container finished" podID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerID="9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963" exitCode=0 Sep 30 19:00:25 crc kubenswrapper[4747]: I0930 19:00:25.187816 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zs2dg" event={"ID":"bb294eab-16c5-4037-ba26-6dae9c8d2558","Type":"ContainerDied","Data":"9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963"} Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.923397 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl"] Sep 30 19:00:26 crc kubenswrapper[4747]: E0930 19:00:26.924119 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="238ba769-d3c7-4101-931c-f752b3343092" containerName="util" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.924135 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="238ba769-d3c7-4101-931c-f752b3343092" containerName="util" Sep 30 19:00:26 crc kubenswrapper[4747]: E0930 19:00:26.924152 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="238ba769-d3c7-4101-931c-f752b3343092" containerName="pull" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.924158 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="238ba769-d3c7-4101-931c-f752b3343092" containerName="pull" Sep 30 19:00:26 crc kubenswrapper[4747]: E0930 19:00:26.924168 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="238ba769-d3c7-4101-931c-f752b3343092" containerName="extract" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.924175 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="238ba769-d3c7-4101-931c-f752b3343092" containerName="extract" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.924279 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="238ba769-d3c7-4101-931c-f752b3343092" containerName="extract" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.924818 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.928774 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.929058 4747 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-4l82v" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.929200 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Sep 30 19:00:26 crc kubenswrapper[4747]: I0930 19:00:26.954471 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl"] Sep 30 19:00:27 crc kubenswrapper[4747]: I0930 19:00:27.076178 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbpt\" (UniqueName: \"kubernetes.io/projected/c8f12dfe-913f-473d-a337-111c2a8311dc-kube-api-access-hdbpt\") pod \"cert-manager-operator-controller-manager-57cd46d6d-lkfnl\" (UID: \"c8f12dfe-913f-473d-a337-111c2a8311dc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl" Sep 30 19:00:27 crc kubenswrapper[4747]: I0930 19:00:27.178754 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbpt\" (UniqueName: 
\"kubernetes.io/projected/c8f12dfe-913f-473d-a337-111c2a8311dc-kube-api-access-hdbpt\") pod \"cert-manager-operator-controller-manager-57cd46d6d-lkfnl\" (UID: \"c8f12dfe-913f-473d-a337-111c2a8311dc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl" Sep 30 19:00:27 crc kubenswrapper[4747]: I0930 19:00:27.199809 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbpt\" (UniqueName: \"kubernetes.io/projected/c8f12dfe-913f-473d-a337-111c2a8311dc-kube-api-access-hdbpt\") pod \"cert-manager-operator-controller-manager-57cd46d6d-lkfnl\" (UID: \"c8f12dfe-913f-473d-a337-111c2a8311dc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl" Sep 30 19:00:27 crc kubenswrapper[4747]: I0930 19:00:27.203560 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zs2dg" event={"ID":"bb294eab-16c5-4037-ba26-6dae9c8d2558","Type":"ContainerStarted","Data":"0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763"} Sep 30 19:00:27 crc kubenswrapper[4747]: I0930 19:00:27.224308 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zs2dg" podStartSLOduration=2.22773777 podStartE2EDuration="5.224286474s" podCreationTimestamp="2025-09-30 19:00:22 +0000 UTC" firstStartedPulling="2025-09-30 19:00:23.171776785 +0000 UTC m=+862.831256899" lastFinishedPulling="2025-09-30 19:00:26.168325489 +0000 UTC m=+865.827805603" observedRunningTime="2025-09-30 19:00:27.221942347 +0000 UTC m=+866.881422481" watchObservedRunningTime="2025-09-30 19:00:27.224286474 +0000 UTC m=+866.883766588" Sep 30 19:00:27 crc kubenswrapper[4747]: I0930 19:00:27.237796 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl" Sep 30 19:00:27 crc kubenswrapper[4747]: I0930 19:00:27.512323 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl"] Sep 30 19:00:27 crc kubenswrapper[4747]: W0930 19:00:27.519311 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f12dfe_913f_473d_a337_111c2a8311dc.slice/crio-5fb9b552c747184fb5be4b43c0df1453319520ff51f66987d993aaa8383c5bd7 WatchSource:0}: Error finding container 5fb9b552c747184fb5be4b43c0df1453319520ff51f66987d993aaa8383c5bd7: Status 404 returned error can't find the container with id 5fb9b552c747184fb5be4b43c0df1453319520ff51f66987d993aaa8383c5bd7 Sep 30 19:00:28 crc kubenswrapper[4747]: I0930 19:00:28.211662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl" event={"ID":"c8f12dfe-913f-473d-a337-111c2a8311dc","Type":"ContainerStarted","Data":"5fb9b552c747184fb5be4b43c0df1453319520ff51f66987d993aaa8383c5bd7"} Sep 30 19:00:32 crc kubenswrapper[4747]: I0930 19:00:32.745528 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:32 crc kubenswrapper[4747]: I0930 19:00:32.746072 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:32 crc kubenswrapper[4747]: I0930 19:00:32.836068 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:33 crc kubenswrapper[4747]: I0930 19:00:33.342723 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:35 crc 
kubenswrapper[4747]: I0930 19:00:35.195055 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zs2dg"] Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.263532 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl" event={"ID":"c8f12dfe-913f-473d-a337-111c2a8311dc","Type":"ContainerStarted","Data":"6d15873a168a601b74a439a1713fe3f2d1bd07c2e4083c73b9323857a53c6372"} Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.263656 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zs2dg" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerName="registry-server" containerID="cri-o://0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763" gracePeriod=2 Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.292668 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-lkfnl" podStartSLOduration=1.9088809150000001 podStartE2EDuration="9.292648361s" podCreationTimestamp="2025-09-30 19:00:26 +0000 UTC" firstStartedPulling="2025-09-30 19:00:27.521760914 +0000 UTC m=+867.181241018" lastFinishedPulling="2025-09-30 19:00:34.90552835 +0000 UTC m=+874.565008464" observedRunningTime="2025-09-30 19:00:35.287622228 +0000 UTC m=+874.947102362" watchObservedRunningTime="2025-09-30 19:00:35.292648361 +0000 UTC m=+874.952128485" Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.717838 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.835657 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-utilities\") pod \"bb294eab-16c5-4037-ba26-6dae9c8d2558\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.835752 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9w2j\" (UniqueName: \"kubernetes.io/projected/bb294eab-16c5-4037-ba26-6dae9c8d2558-kube-api-access-z9w2j\") pod \"bb294eab-16c5-4037-ba26-6dae9c8d2558\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.835783 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-catalog-content\") pod \"bb294eab-16c5-4037-ba26-6dae9c8d2558\" (UID: \"bb294eab-16c5-4037-ba26-6dae9c8d2558\") " Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.836706 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-utilities" (OuterVolumeSpecName: "utilities") pod "bb294eab-16c5-4037-ba26-6dae9c8d2558" (UID: "bb294eab-16c5-4037-ba26-6dae9c8d2558"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.841430 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb294eab-16c5-4037-ba26-6dae9c8d2558-kube-api-access-z9w2j" (OuterVolumeSpecName: "kube-api-access-z9w2j") pod "bb294eab-16c5-4037-ba26-6dae9c8d2558" (UID: "bb294eab-16c5-4037-ba26-6dae9c8d2558"). InnerVolumeSpecName "kube-api-access-z9w2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.880551 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb294eab-16c5-4037-ba26-6dae9c8d2558" (UID: "bb294eab-16c5-4037-ba26-6dae9c8d2558"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.937042 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9w2j\" (UniqueName: \"kubernetes.io/projected/bb294eab-16c5-4037-ba26-6dae9c8d2558-kube-api-access-z9w2j\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.937092 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:35 crc kubenswrapper[4747]: I0930 19:00:35.937115 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb294eab-16c5-4037-ba26-6dae9c8d2558-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.271292 4747 generic.go:334] "Generic (PLEG): container finished" podID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerID="0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763" exitCode=0 Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.271400 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zs2dg" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.272058 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zs2dg" event={"ID":"bb294eab-16c5-4037-ba26-6dae9c8d2558","Type":"ContainerDied","Data":"0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763"} Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.272087 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zs2dg" event={"ID":"bb294eab-16c5-4037-ba26-6dae9c8d2558","Type":"ContainerDied","Data":"f18de288ad98ba04b3488f2fc95b0d3a0457ca85437448b921703ad4330f3003"} Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.272105 4747 scope.go:117] "RemoveContainer" containerID="0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.294747 4747 scope.go:117] "RemoveContainer" containerID="9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.314849 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zs2dg"] Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.325437 4747 scope.go:117] "RemoveContainer" containerID="6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.327694 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zs2dg"] Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.345080 4747 scope.go:117] "RemoveContainer" containerID="0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763" Sep 30 19:00:36 crc kubenswrapper[4747]: E0930 19:00:36.345608 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763\": container with ID starting with 0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763 not found: ID does not exist" containerID="0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.345703 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763"} err="failed to get container status \"0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763\": rpc error: code = NotFound desc = could not find container \"0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763\": container with ID starting with 0852dc653b19be508c0f09d5df0b00e69f5719f216e08cca95d3ad0da791b763 not found: ID does not exist" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.345794 4747 scope.go:117] "RemoveContainer" containerID="9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963" Sep 30 19:00:36 crc kubenswrapper[4747]: E0930 19:00:36.346227 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963\": container with ID starting with 9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963 not found: ID does not exist" containerID="9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.346313 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963"} err="failed to get container status \"9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963\": rpc error: code = NotFound desc = could not find container \"9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963\": container with ID 
starting with 9d1945046765fcbb8336958e3e67b16d376ab99d222fd96b082d7f5829d3a963 not found: ID does not exist" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.346374 4747 scope.go:117] "RemoveContainer" containerID="6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a" Sep 30 19:00:36 crc kubenswrapper[4747]: E0930 19:00:36.347472 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a\": container with ID starting with 6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a not found: ID does not exist" containerID="6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a" Sep 30 19:00:36 crc kubenswrapper[4747]: I0930 19:00:36.347548 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a"} err="failed to get container status \"6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a\": rpc error: code = NotFound desc = could not find container \"6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a\": container with ID starting with 6df4608f5f35a9e40704e376fbbcc4359f227225f525b73f1b85551b1d9a428a not found: ID does not exist" Sep 30 19:00:37 crc kubenswrapper[4747]: I0930 19:00:37.095021 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" path="/var/lib/kubelet/pods/bb294eab-16c5-4037-ba26-6dae9c8d2558/volumes" Sep 30 19:00:37 crc kubenswrapper[4747]: I0930 19:00:37.656096 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:00:37 crc kubenswrapper[4747]: I0930 
19:00:37.656182 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.183351 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-2qsp9"] Sep 30 19:00:39 crc kubenswrapper[4747]: E0930 19:00:39.183679 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerName="extract-utilities" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.183700 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerName="extract-utilities" Sep 30 19:00:39 crc kubenswrapper[4747]: E0930 19:00:39.183723 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerName="registry-server" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.183736 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerName="registry-server" Sep 30 19:00:39 crc kubenswrapper[4747]: E0930 19:00:39.183772 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerName="extract-content" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.183785 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerName="extract-content" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.184001 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb294eab-16c5-4037-ba26-6dae9c8d2558" containerName="registry-server" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.184614 4747 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.187245 4747 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2f9xw" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.187245 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.188058 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.196578 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-2qsp9"] Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.291973 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/809dacb9-cd8e-4950-8092-bc3c647db303-bound-sa-token\") pod \"cert-manager-webhook-d969966f-2qsp9\" (UID: \"809dacb9-cd8e-4950-8092-bc3c647db303\") " pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.292228 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8wvm\" (UniqueName: \"kubernetes.io/projected/809dacb9-cd8e-4950-8092-bc3c647db303-kube-api-access-w8wvm\") pod \"cert-manager-webhook-d969966f-2qsp9\" (UID: \"809dacb9-cd8e-4950-8092-bc3c647db303\") " pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.393502 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/809dacb9-cd8e-4950-8092-bc3c647db303-bound-sa-token\") pod \"cert-manager-webhook-d969966f-2qsp9\" (UID: 
\"809dacb9-cd8e-4950-8092-bc3c647db303\") " pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.393670 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8wvm\" (UniqueName: \"kubernetes.io/projected/809dacb9-cd8e-4950-8092-bc3c647db303-kube-api-access-w8wvm\") pod \"cert-manager-webhook-d969966f-2qsp9\" (UID: \"809dacb9-cd8e-4950-8092-bc3c647db303\") " pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.423245 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8wvm\" (UniqueName: \"kubernetes.io/projected/809dacb9-cd8e-4950-8092-bc3c647db303-kube-api-access-w8wvm\") pod \"cert-manager-webhook-d969966f-2qsp9\" (UID: \"809dacb9-cd8e-4950-8092-bc3c647db303\") " pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.427680 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/809dacb9-cd8e-4950-8092-bc3c647db303-bound-sa-token\") pod \"cert-manager-webhook-d969966f-2qsp9\" (UID: \"809dacb9-cd8e-4950-8092-bc3c647db303\") " pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.508425 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.624692 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd"] Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.626121 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.631354 4747 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-76znl" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.634418 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd"] Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.696781 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w566t\" (UniqueName: \"kubernetes.io/projected/7469fdd1-8820-4af1-87ee-d4bd00a8f211-kube-api-access-w566t\") pod \"cert-manager-cainjector-7d9f95dbf-4slkd\" (UID: \"7469fdd1-8820-4af1-87ee-d4bd00a8f211\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.696871 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7469fdd1-8820-4af1-87ee-d4bd00a8f211-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4slkd\" (UID: \"7469fdd1-8820-4af1-87ee-d4bd00a8f211\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.798284 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w566t\" (UniqueName: \"kubernetes.io/projected/7469fdd1-8820-4af1-87ee-d4bd00a8f211-kube-api-access-w566t\") pod \"cert-manager-cainjector-7d9f95dbf-4slkd\" (UID: \"7469fdd1-8820-4af1-87ee-d4bd00a8f211\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.798393 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7469fdd1-8820-4af1-87ee-d4bd00a8f211-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4slkd\" (UID: \"7469fdd1-8820-4af1-87ee-d4bd00a8f211\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.816043 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w566t\" (UniqueName: \"kubernetes.io/projected/7469fdd1-8820-4af1-87ee-d4bd00a8f211-kube-api-access-w566t\") pod \"cert-manager-cainjector-7d9f95dbf-4slkd\" (UID: \"7469fdd1-8820-4af1-87ee-d4bd00a8f211\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.816757 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7469fdd1-8820-4af1-87ee-d4bd00a8f211-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4slkd\" (UID: \"7469fdd1-8820-4af1-87ee-d4bd00a8f211\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.958634 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" Sep 30 19:00:39 crc kubenswrapper[4747]: I0930 19:00:39.967608 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-2qsp9"] Sep 30 19:00:39 crc kubenswrapper[4747]: W0930 19:00:39.981242 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod809dacb9_cd8e_4950_8092_bc3c647db303.slice/crio-f0f59a81fccf5b3b60c1095ce7bfc1051af9de919d14984c204ed67aa3c18766 WatchSource:0}: Error finding container f0f59a81fccf5b3b60c1095ce7bfc1051af9de919d14984c204ed67aa3c18766: Status 404 returned error can't find the container with id f0f59a81fccf5b3b60c1095ce7bfc1051af9de919d14984c204ed67aa3c18766 Sep 30 19:00:40 crc kubenswrapper[4747]: I0930 19:00:40.198474 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd"] Sep 30 19:00:40 crc kubenswrapper[4747]: W0930 19:00:40.212850 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7469fdd1_8820_4af1_87ee_d4bd00a8f211.slice/crio-3ef8b8f0e649e5af81e85568f7f97a16432764ccb291a1f0a7e9513d3b029ec2 WatchSource:0}: Error finding container 3ef8b8f0e649e5af81e85568f7f97a16432764ccb291a1f0a7e9513d3b029ec2: Status 404 returned error can't find the container with id 3ef8b8f0e649e5af81e85568f7f97a16432764ccb291a1f0a7e9513d3b029ec2 Sep 30 19:00:40 crc kubenswrapper[4747]: I0930 19:00:40.296980 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" event={"ID":"7469fdd1-8820-4af1-87ee-d4bd00a8f211","Type":"ContainerStarted","Data":"3ef8b8f0e649e5af81e85568f7f97a16432764ccb291a1f0a7e9513d3b029ec2"} Sep 30 19:00:40 crc kubenswrapper[4747]: I0930 19:00:40.298311 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" event={"ID":"809dacb9-cd8e-4950-8092-bc3c647db303","Type":"ContainerStarted","Data":"f0f59a81fccf5b3b60c1095ce7bfc1051af9de919d14984c204ed67aa3c18766"} Sep 30 19:00:45 crc kubenswrapper[4747]: I0930 19:00:45.330746 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" event={"ID":"7469fdd1-8820-4af1-87ee-d4bd00a8f211","Type":"ContainerStarted","Data":"ee644b382ed1f6f75741499653afb452aa4a94210528df36d86425e7ae02040e"} Sep 30 19:00:45 crc kubenswrapper[4747]: I0930 19:00:45.332362 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" event={"ID":"809dacb9-cd8e-4950-8092-bc3c647db303","Type":"ContainerStarted","Data":"bbc35e75f33ee56eae19dea9db0128955ae12e6839688b9a2daaa1fd0f172999"} Sep 30 19:00:45 crc kubenswrapper[4747]: I0930 19:00:45.332468 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:45 crc kubenswrapper[4747]: I0930 19:00:45.362233 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4slkd" podStartSLOduration=1.806532599 podStartE2EDuration="6.362208556s" podCreationTimestamp="2025-09-30 19:00:39 +0000 UTC" firstStartedPulling="2025-09-30 19:00:40.215760129 +0000 UTC m=+879.875240253" lastFinishedPulling="2025-09-30 19:00:44.771436096 +0000 UTC m=+884.430916210" observedRunningTime="2025-09-30 19:00:45.356659878 +0000 UTC m=+885.016139992" watchObservedRunningTime="2025-09-30 19:00:45.362208556 +0000 UTC m=+885.021688660" Sep 30 19:00:49 crc kubenswrapper[4747]: I0930 19:00:49.512229 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" Sep 30 19:00:49 crc kubenswrapper[4747]: I0930 19:00:49.538124 4747 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="cert-manager/cert-manager-webhook-d969966f-2qsp9" podStartSLOduration=5.770122147 podStartE2EDuration="10.538099018s" podCreationTimestamp="2025-09-30 19:00:39 +0000 UTC" firstStartedPulling="2025-09-30 19:00:39.985667988 +0000 UTC m=+879.645148122" lastFinishedPulling="2025-09-30 19:00:44.753644889 +0000 UTC m=+884.413124993" observedRunningTime="2025-09-30 19:00:45.383288356 +0000 UTC m=+885.042768470" watchObservedRunningTime="2025-09-30 19:00:49.538099018 +0000 UTC m=+889.197579162" Sep 30 19:00:57 crc kubenswrapper[4747]: I0930 19:00:57.881967 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-q2nml"] Sep 30 19:00:57 crc kubenswrapper[4747]: I0930 19:00:57.884176 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" Sep 30 19:00:57 crc kubenswrapper[4747]: I0930 19:00:57.890280 4747 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-92rqq" Sep 30 19:00:57 crc kubenswrapper[4747]: I0930 19:00:57.897200 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvx4\" (UniqueName: \"kubernetes.io/projected/c8b2ec10-81fd-46e8-b73c-b8141264574c-kube-api-access-5dvx4\") pod \"cert-manager-7d4cc89fcb-q2nml\" (UID: \"c8b2ec10-81fd-46e8-b73c-b8141264574c\") " pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" Sep 30 19:00:57 crc kubenswrapper[4747]: I0930 19:00:57.897315 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8b2ec10-81fd-46e8-b73c-b8141264574c-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-q2nml\" (UID: \"c8b2ec10-81fd-46e8-b73c-b8141264574c\") " pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" Sep 30 19:00:57 crc kubenswrapper[4747]: I0930 19:00:57.899386 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-7d4cc89fcb-q2nml"] Sep 30 19:00:57 crc kubenswrapper[4747]: I0930 19:00:57.998863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvx4\" (UniqueName: \"kubernetes.io/projected/c8b2ec10-81fd-46e8-b73c-b8141264574c-kube-api-access-5dvx4\") pod \"cert-manager-7d4cc89fcb-q2nml\" (UID: \"c8b2ec10-81fd-46e8-b73c-b8141264574c\") " pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" Sep 30 19:00:57 crc kubenswrapper[4747]: I0930 19:00:57.999024 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8b2ec10-81fd-46e8-b73c-b8141264574c-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-q2nml\" (UID: \"c8b2ec10-81fd-46e8-b73c-b8141264574c\") " pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" Sep 30 19:00:58 crc kubenswrapper[4747]: I0930 19:00:58.032576 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvx4\" (UniqueName: \"kubernetes.io/projected/c8b2ec10-81fd-46e8-b73c-b8141264574c-kube-api-access-5dvx4\") pod \"cert-manager-7d4cc89fcb-q2nml\" (UID: \"c8b2ec10-81fd-46e8-b73c-b8141264574c\") " pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" Sep 30 19:00:58 crc kubenswrapper[4747]: I0930 19:00:58.036883 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8b2ec10-81fd-46e8-b73c-b8141264574c-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-q2nml\" (UID: \"c8b2ec10-81fd-46e8-b73c-b8141264574c\") " pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" Sep 30 19:00:58 crc kubenswrapper[4747]: I0930 19:00:58.219648 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" Sep 30 19:00:58 crc kubenswrapper[4747]: I0930 19:00:58.718554 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-q2nml"] Sep 30 19:00:58 crc kubenswrapper[4747]: W0930 19:00:58.726366 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b2ec10_81fd_46e8_b73c_b8141264574c.slice/crio-efd19554f60b4f8822ab061dcb46556e4c93803bd700c785aa7830d41c06b51b WatchSource:0}: Error finding container efd19554f60b4f8822ab061dcb46556e4c93803bd700c785aa7830d41c06b51b: Status 404 returned error can't find the container with id efd19554f60b4f8822ab061dcb46556e4c93803bd700c785aa7830d41c06b51b Sep 30 19:00:59 crc kubenswrapper[4747]: I0930 19:00:59.456225 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" event={"ID":"c8b2ec10-81fd-46e8-b73c-b8141264574c","Type":"ContainerStarted","Data":"2bbf59d45ec74c49c0dec1809c3515bdaba8d477be2348cb7efda6751cded2e2"} Sep 30 19:00:59 crc kubenswrapper[4747]: I0930 19:00:59.456819 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" event={"ID":"c8b2ec10-81fd-46e8-b73c-b8141264574c","Type":"ContainerStarted","Data":"efd19554f60b4f8822ab061dcb46556e4c93803bd700c785aa7830d41c06b51b"} Sep 30 19:00:59 crc kubenswrapper[4747]: I0930 19:00:59.489367 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-q2nml" podStartSLOduration=2.489309324 podStartE2EDuration="2.489309324s" podCreationTimestamp="2025-09-30 19:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:00:59.479014311 +0000 UTC m=+899.138494465" watchObservedRunningTime="2025-09-30 19:00:59.489309324 +0000 UTC m=+899.148789488" Sep 30 
19:01:02 crc kubenswrapper[4747]: I0930 19:01:02.889206 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vm69b"] Sep 30 19:01:02 crc kubenswrapper[4747]: I0930 19:01:02.891506 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vm69b" Sep 30 19:01:02 crc kubenswrapper[4747]: I0930 19:01:02.894253 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 19:01:02 crc kubenswrapper[4747]: I0930 19:01:02.895062 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 19:01:02 crc kubenswrapper[4747]: I0930 19:01:02.898323 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-dtb4g" Sep 30 19:01:02 crc kubenswrapper[4747]: I0930 19:01:02.962050 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vm69b"] Sep 30 19:01:03 crc kubenswrapper[4747]: I0930 19:01:03.080526 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr2jl\" (UniqueName: \"kubernetes.io/projected/a1fb3edf-a069-4124-bf98-460399fbadcd-kube-api-access-qr2jl\") pod \"openstack-operator-index-vm69b\" (UID: \"a1fb3edf-a069-4124-bf98-460399fbadcd\") " pod="openstack-operators/openstack-operator-index-vm69b" Sep 30 19:01:03 crc kubenswrapper[4747]: I0930 19:01:03.182526 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr2jl\" (UniqueName: \"kubernetes.io/projected/a1fb3edf-a069-4124-bf98-460399fbadcd-kube-api-access-qr2jl\") pod \"openstack-operator-index-vm69b\" (UID: \"a1fb3edf-a069-4124-bf98-460399fbadcd\") " pod="openstack-operators/openstack-operator-index-vm69b" Sep 30 19:01:03 crc kubenswrapper[4747]: I0930 
19:01:03.204396 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr2jl\" (UniqueName: \"kubernetes.io/projected/a1fb3edf-a069-4124-bf98-460399fbadcd-kube-api-access-qr2jl\") pod \"openstack-operator-index-vm69b\" (UID: \"a1fb3edf-a069-4124-bf98-460399fbadcd\") " pod="openstack-operators/openstack-operator-index-vm69b" Sep 30 19:01:03 crc kubenswrapper[4747]: I0930 19:01:03.245235 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vm69b" Sep 30 19:01:03 crc kubenswrapper[4747]: I0930 19:01:03.521827 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vm69b"] Sep 30 19:01:03 crc kubenswrapper[4747]: W0930 19:01:03.539210 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1fb3edf_a069_4124_bf98_460399fbadcd.slice/crio-b462717a959a61097789d16b7b752ea37f0fe5d8a73480fc06a54d70a8d9989e WatchSource:0}: Error finding container b462717a959a61097789d16b7b752ea37f0fe5d8a73480fc06a54d70a8d9989e: Status 404 returned error can't find the container with id b462717a959a61097789d16b7b752ea37f0fe5d8a73480fc06a54d70a8d9989e Sep 30 19:01:04 crc kubenswrapper[4747]: I0930 19:01:04.495856 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vm69b" event={"ID":"a1fb3edf-a069-4124-bf98-460399fbadcd","Type":"ContainerStarted","Data":"b462717a959a61097789d16b7b752ea37f0fe5d8a73480fc06a54d70a8d9989e"} Sep 30 19:01:05 crc kubenswrapper[4747]: I0930 19:01:05.655919 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vm69b"] Sep 30 19:01:06 crc kubenswrapper[4747]: I0930 19:01:06.260592 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-drgjp"] Sep 30 19:01:06 crc kubenswrapper[4747]: I0930 
19:01:06.261355 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:06 crc kubenswrapper[4747]: I0930 19:01:06.333190 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-drgjp"] Sep 30 19:01:06 crc kubenswrapper[4747]: I0930 19:01:06.431037 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gphv2\" (UniqueName: \"kubernetes.io/projected/ea91f08a-965d-42d3-bad0-62e39f0c442f-kube-api-access-gphv2\") pod \"openstack-operator-index-drgjp\" (UID: \"ea91f08a-965d-42d3-bad0-62e39f0c442f\") " pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:06 crc kubenswrapper[4747]: I0930 19:01:06.532218 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gphv2\" (UniqueName: \"kubernetes.io/projected/ea91f08a-965d-42d3-bad0-62e39f0c442f-kube-api-access-gphv2\") pod \"openstack-operator-index-drgjp\" (UID: \"ea91f08a-965d-42d3-bad0-62e39f0c442f\") " pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:06 crc kubenswrapper[4747]: I0930 19:01:06.556165 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gphv2\" (UniqueName: \"kubernetes.io/projected/ea91f08a-965d-42d3-bad0-62e39f0c442f-kube-api-access-gphv2\") pod \"openstack-operator-index-drgjp\" (UID: \"ea91f08a-965d-42d3-bad0-62e39f0c442f\") " pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:06 crc kubenswrapper[4747]: I0930 19:01:06.586990 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:06 crc kubenswrapper[4747]: I0930 19:01:06.855261 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-drgjp"] Sep 30 19:01:06 crc kubenswrapper[4747]: W0930 19:01:06.866321 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea91f08a_965d_42d3_bad0_62e39f0c442f.slice/crio-ce939432f9cff5a538977b2ac81f51dd8bac9ee0ffe9cc5b41bde9e912731f12 WatchSource:0}: Error finding container ce939432f9cff5a538977b2ac81f51dd8bac9ee0ffe9cc5b41bde9e912731f12: Status 404 returned error can't find the container with id ce939432f9cff5a538977b2ac81f51dd8bac9ee0ffe9cc5b41bde9e912731f12 Sep 30 19:01:07 crc kubenswrapper[4747]: I0930 19:01:07.524230 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-drgjp" event={"ID":"ea91f08a-965d-42d3-bad0-62e39f0c442f","Type":"ContainerStarted","Data":"14d4a2c171649643bb41a635bd73bf5c28f8f63af3c928d253bcd72587360dff"} Sep 30 19:01:07 crc kubenswrapper[4747]: I0930 19:01:07.524681 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-drgjp" event={"ID":"ea91f08a-965d-42d3-bad0-62e39f0c442f","Type":"ContainerStarted","Data":"ce939432f9cff5a538977b2ac81f51dd8bac9ee0ffe9cc5b41bde9e912731f12"} Sep 30 19:01:07 crc kubenswrapper[4747]: I0930 19:01:07.526469 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vm69b" event={"ID":"a1fb3edf-a069-4124-bf98-460399fbadcd","Type":"ContainerStarted","Data":"beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055"} Sep 30 19:01:07 crc kubenswrapper[4747]: I0930 19:01:07.526643 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-vm69b" 
podUID="a1fb3edf-a069-4124-bf98-460399fbadcd" containerName="registry-server" containerID="cri-o://beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055" gracePeriod=2 Sep 30 19:01:07 crc kubenswrapper[4747]: I0930 19:01:07.548833 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-drgjp" podStartSLOduration=1.5021984609999999 podStartE2EDuration="1.548800248s" podCreationTimestamp="2025-09-30 19:01:06 +0000 UTC" firstStartedPulling="2025-09-30 19:01:06.87131378 +0000 UTC m=+906.530793894" lastFinishedPulling="2025-09-30 19:01:06.917915557 +0000 UTC m=+906.577395681" observedRunningTime="2025-09-30 19:01:07.544362212 +0000 UTC m=+907.203842386" watchObservedRunningTime="2025-09-30 19:01:07.548800248 +0000 UTC m=+907.208280392" Sep 30 19:01:07 crc kubenswrapper[4747]: I0930 19:01:07.572344 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vm69b" podStartSLOduration=2.602069232 podStartE2EDuration="5.572315728s" podCreationTimestamp="2025-09-30 19:01:02 +0000 UTC" firstStartedPulling="2025-09-30 19:01:03.546969662 +0000 UTC m=+903.206449786" lastFinishedPulling="2025-09-30 19:01:06.517216138 +0000 UTC m=+906.176696282" observedRunningTime="2025-09-30 19:01:07.56185678 +0000 UTC m=+907.221336934" watchObservedRunningTime="2025-09-30 19:01:07.572315728 +0000 UTC m=+907.231795872" Sep 30 19:01:07 crc kubenswrapper[4747]: I0930 19:01:07.655526 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:01:07 crc kubenswrapper[4747]: I0930 19:01:07.655603 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" 
podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.013824 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vm69b" Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.155718 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr2jl\" (UniqueName: \"kubernetes.io/projected/a1fb3edf-a069-4124-bf98-460399fbadcd-kube-api-access-qr2jl\") pod \"a1fb3edf-a069-4124-bf98-460399fbadcd\" (UID: \"a1fb3edf-a069-4124-bf98-460399fbadcd\") " Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.163144 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1fb3edf-a069-4124-bf98-460399fbadcd-kube-api-access-qr2jl" (OuterVolumeSpecName: "kube-api-access-qr2jl") pod "a1fb3edf-a069-4124-bf98-460399fbadcd" (UID: "a1fb3edf-a069-4124-bf98-460399fbadcd"). InnerVolumeSpecName "kube-api-access-qr2jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.258117 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr2jl\" (UniqueName: \"kubernetes.io/projected/a1fb3edf-a069-4124-bf98-460399fbadcd-kube-api-access-qr2jl\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.536299 4747 generic.go:334] "Generic (PLEG): container finished" podID="a1fb3edf-a069-4124-bf98-460399fbadcd" containerID="beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055" exitCode=0 Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.537499 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vm69b" Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.537638 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vm69b" event={"ID":"a1fb3edf-a069-4124-bf98-460399fbadcd","Type":"ContainerDied","Data":"beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055"} Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.537689 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vm69b" event={"ID":"a1fb3edf-a069-4124-bf98-460399fbadcd","Type":"ContainerDied","Data":"b462717a959a61097789d16b7b752ea37f0fe5d8a73480fc06a54d70a8d9989e"} Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.537719 4747 scope.go:117] "RemoveContainer" containerID="beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055" Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.558233 4747 scope.go:117] "RemoveContainer" containerID="beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055" Sep 30 19:01:08 crc kubenswrapper[4747]: E0930 19:01:08.562438 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055\": container with ID starting with beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055 not found: ID does not exist" containerID="beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055" Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.562473 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055"} err="failed to get container status \"beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055\": rpc error: code = NotFound desc = could not find container 
\"beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055\": container with ID starting with beecc2a92ea443ded28e0321715c20b1539dc7b457d269ad9ea82fe8bb92e055 not found: ID does not exist" Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.581395 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vm69b"] Sep 30 19:01:08 crc kubenswrapper[4747]: I0930 19:01:08.589403 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-vm69b"] Sep 30 19:01:09 crc kubenswrapper[4747]: I0930 19:01:09.102020 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1fb3edf-a069-4124-bf98-460399fbadcd" path="/var/lib/kubelet/pods/a1fb3edf-a069-4124-bf98-460399fbadcd/volumes" Sep 30 19:01:16 crc kubenswrapper[4747]: I0930 19:01:16.587659 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:16 crc kubenswrapper[4747]: I0930 19:01:16.589201 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:16 crc kubenswrapper[4747]: I0930 19:01:16.625153 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:17 crc kubenswrapper[4747]: I0930 19:01:17.646220 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-drgjp" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.247063 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd"] Sep 30 19:01:24 crc kubenswrapper[4747]: E0930 19:01:24.247859 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1fb3edf-a069-4124-bf98-460399fbadcd" containerName="registry-server" Sep 30 19:01:24 
crc kubenswrapper[4747]: I0930 19:01:24.247888 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1fb3edf-a069-4124-bf98-460399fbadcd" containerName="registry-server" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.248193 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1fb3edf-a069-4124-bf98-460399fbadcd" containerName="registry-server" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.250304 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.254297 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-skhg6" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.270539 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd"] Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.328013 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-util\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.328088 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-bundle\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.328130 
4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btwtm\" (UniqueName: \"kubernetes.io/projected/81844e1b-ba6d-4561-8997-4b784d92c5da-kube-api-access-btwtm\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.430028 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-util\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.430521 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-bundle\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.430782 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btwtm\" (UniqueName: \"kubernetes.io/projected/81844e1b-ba6d-4561-8997-4b784d92c5da-kube-api-access-btwtm\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.431257 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-util\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.431359 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-bundle\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.465035 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwtm\" (UniqueName: \"kubernetes.io/projected/81844e1b-ba6d-4561-8997-4b784d92c5da-kube-api-access-btwtm\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.587578 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:24 crc kubenswrapper[4747]: I0930 19:01:24.909646 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd"] Sep 30 19:01:25 crc kubenswrapper[4747]: I0930 19:01:25.676226 4747 generic.go:334] "Generic (PLEG): container finished" podID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerID="01d335286e1e7fd58d564f7419e0b95f37bb3ef690a28be8fde1628fc711cb93" exitCode=0 Sep 30 19:01:25 crc kubenswrapper[4747]: I0930 19:01:25.676345 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" event={"ID":"81844e1b-ba6d-4561-8997-4b784d92c5da","Type":"ContainerDied","Data":"01d335286e1e7fd58d564f7419e0b95f37bb3ef690a28be8fde1628fc711cb93"} Sep 30 19:01:25 crc kubenswrapper[4747]: I0930 19:01:25.676701 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" event={"ID":"81844e1b-ba6d-4561-8997-4b784d92c5da","Type":"ContainerStarted","Data":"1f5bc262cfb0f34fe73f3fa9418acfaa444b92b0c726ad31f0404002fac9c5d9"} Sep 30 19:01:26 crc kubenswrapper[4747]: I0930 19:01:26.696239 4747 generic.go:334] "Generic (PLEG): container finished" podID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerID="c1414816d135ac3de8950815268b38d4e55608c144416fdb6094412ffc775e64" exitCode=0 Sep 30 19:01:26 crc kubenswrapper[4747]: I0930 19:01:26.696378 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" event={"ID":"81844e1b-ba6d-4561-8997-4b784d92c5da","Type":"ContainerDied","Data":"c1414816d135ac3de8950815268b38d4e55608c144416fdb6094412ffc775e64"} Sep 30 19:01:27 crc kubenswrapper[4747]: I0930 19:01:27.707071 4747 generic.go:334] 
"Generic (PLEG): container finished" podID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerID="32c8edbf893ea1ba798914e1fdc81d157af572f6e3b44927d484ee3dfedc58e4" exitCode=0 Sep 30 19:01:27 crc kubenswrapper[4747]: I0930 19:01:27.707219 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" event={"ID":"81844e1b-ba6d-4561-8997-4b784d92c5da","Type":"ContainerDied","Data":"32c8edbf893ea1ba798914e1fdc81d157af572f6e3b44927d484ee3dfedc58e4"} Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.069383 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.233637 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-bundle\") pod \"81844e1b-ba6d-4561-8997-4b784d92c5da\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.233729 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-util\") pod \"81844e1b-ba6d-4561-8997-4b784d92c5da\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.233773 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btwtm\" (UniqueName: \"kubernetes.io/projected/81844e1b-ba6d-4561-8997-4b784d92c5da-kube-api-access-btwtm\") pod \"81844e1b-ba6d-4561-8997-4b784d92c5da\" (UID: \"81844e1b-ba6d-4561-8997-4b784d92c5da\") " Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.234578 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-bundle" (OuterVolumeSpecName: "bundle") pod "81844e1b-ba6d-4561-8997-4b784d92c5da" (UID: "81844e1b-ba6d-4561-8997-4b784d92c5da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.243124 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81844e1b-ba6d-4561-8997-4b784d92c5da-kube-api-access-btwtm" (OuterVolumeSpecName: "kube-api-access-btwtm") pod "81844e1b-ba6d-4561-8997-4b784d92c5da" (UID: "81844e1b-ba6d-4561-8997-4b784d92c5da"). InnerVolumeSpecName "kube-api-access-btwtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.264394 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-util" (OuterVolumeSpecName: "util") pod "81844e1b-ba6d-4561-8997-4b784d92c5da" (UID: "81844e1b-ba6d-4561-8997-4b784d92c5da"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.335301 4747 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.335353 4747 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81844e1b-ba6d-4561-8997-4b784d92c5da-util\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.335373 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btwtm\" (UniqueName: \"kubernetes.io/projected/81844e1b-ba6d-4561-8997-4b784d92c5da-kube-api-access-btwtm\") on node \"crc\" DevicePath \"\"" Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.731233 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" event={"ID":"81844e1b-ba6d-4561-8997-4b784d92c5da","Type":"ContainerDied","Data":"1f5bc262cfb0f34fe73f3fa9418acfaa444b92b0c726ad31f0404002fac9c5d9"} Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.731295 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f5bc262cfb0f34fe73f3fa9418acfaa444b92b0c726ad31f0404002fac9c5d9" Sep 30 19:01:29 crc kubenswrapper[4747]: I0930 19:01:29.731311 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd" Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.812828 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f"] Sep 30 19:01:36 crc kubenswrapper[4747]: E0930 19:01:36.813949 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerName="pull" Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.813974 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerName="pull" Sep 30 19:01:36 crc kubenswrapper[4747]: E0930 19:01:36.814012 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerName="util" Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.814025 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerName="util" Sep 30 19:01:36 crc kubenswrapper[4747]: E0930 19:01:36.814038 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerName="extract" Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.814051 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerName="extract" Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.814226 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="81844e1b-ba6d-4561-8997-4b784d92c5da" containerName="extract" Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.815659 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.819860 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-zb574" Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.847345 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f"] Sep 30 19:01:36 crc kubenswrapper[4747]: I0930 19:01:36.956443 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bk4\" (UniqueName: \"kubernetes.io/projected/8ae757f1-995e-4b91-ac4f-20962f2a7072-kube-api-access-b8bk4\") pod \"openstack-operator-controller-operator-d8fdfd448-nhb7f\" (UID: \"8ae757f1-995e-4b91-ac4f-20962f2a7072\") " pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.057793 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8bk4\" (UniqueName: \"kubernetes.io/projected/8ae757f1-995e-4b91-ac4f-20962f2a7072-kube-api-access-b8bk4\") pod \"openstack-operator-controller-operator-d8fdfd448-nhb7f\" (UID: \"8ae757f1-995e-4b91-ac4f-20962f2a7072\") " pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.080261 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8bk4\" (UniqueName: \"kubernetes.io/projected/8ae757f1-995e-4b91-ac4f-20962f2a7072-kube-api-access-b8bk4\") pod \"openstack-operator-controller-operator-d8fdfd448-nhb7f\" (UID: \"8ae757f1-995e-4b91-ac4f-20962f2a7072\") " pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.132541 4747 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.393274 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f"] Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.654966 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.655016 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.655096 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.655799 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9324a2c247fa6850748cdd90467f095bd666e1119af1ca69c9f4d4385e9867bb"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.655900 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" 
containerName="machine-config-daemon" containerID="cri-o://9324a2c247fa6850748cdd90467f095bd666e1119af1ca69c9f4d4385e9867bb" gracePeriod=600 Sep 30 19:01:37 crc kubenswrapper[4747]: I0930 19:01:37.799732 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" event={"ID":"8ae757f1-995e-4b91-ac4f-20962f2a7072","Type":"ContainerStarted","Data":"66d10fbb0cd73b26fb7e193e9a69ebb86ca3e49d5c2796e48d536b7508f6ce8b"} Sep 30 19:01:38 crc kubenswrapper[4747]: I0930 19:01:38.814603 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="9324a2c247fa6850748cdd90467f095bd666e1119af1ca69c9f4d4385e9867bb" exitCode=0 Sep 30 19:01:38 crc kubenswrapper[4747]: I0930 19:01:38.814668 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"9324a2c247fa6850748cdd90467f095bd666e1119af1ca69c9f4d4385e9867bb"} Sep 30 19:01:38 crc kubenswrapper[4747]: I0930 19:01:38.816037 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"2d1099e7b2e4398f262beea3d468545e7d66ad71310f8a69af5aae38ae2c6601"} Sep 30 19:01:38 crc kubenswrapper[4747]: I0930 19:01:38.816059 4747 scope.go:117] "RemoveContainer" containerID="f3b9f45b84cc1eae815bcdc0ad8efb2eb78da9ac4324427d149fbbf26250b353" Sep 30 19:01:41 crc kubenswrapper[4747]: I0930 19:01:41.851874 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" event={"ID":"8ae757f1-995e-4b91-ac4f-20962f2a7072","Type":"ContainerStarted","Data":"6e8c317f255cb63f7efa8e7f10f3b2e4d7f4cfdfdd2d2b59ec327001b5bad7eb"} Sep 30 19:01:44 crc kubenswrapper[4747]: I0930 
19:01:44.892938 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" event={"ID":"8ae757f1-995e-4b91-ac4f-20962f2a7072","Type":"ContainerStarted","Data":"35c6bca088681c61adf24d535644b02d857e340bfb79186625689a6101833a74"} Sep 30 19:01:44 crc kubenswrapper[4747]: I0930 19:01:44.893554 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" Sep 30 19:01:44 crc kubenswrapper[4747]: I0930 19:01:44.940944 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" podStartSLOduration=2.585743043 podStartE2EDuration="8.940918904s" podCreationTimestamp="2025-09-30 19:01:36 +0000 UTC" firstStartedPulling="2025-09-30 19:01:37.399732436 +0000 UTC m=+937.059212550" lastFinishedPulling="2025-09-30 19:01:43.754908297 +0000 UTC m=+943.414388411" observedRunningTime="2025-09-30 19:01:44.935048407 +0000 UTC m=+944.594528531" watchObservedRunningTime="2025-09-30 19:01:44.940918904 +0000 UTC m=+944.600399018" Sep 30 19:01:47 crc kubenswrapper[4747]: I0930 19:01:47.135389 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-nhb7f" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.544174 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.545866 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.549146 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-n2sbj" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.552224 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.553502 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.555676 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bl9jj" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.557617 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.613643 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.614816 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.620497 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-n4vbj" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.636800 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.646212 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.657590 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.663594 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.668142 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-22kws" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.678085 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.682452 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.685791 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.688982 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mqkhb" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.695110 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.715097 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.716051 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.719255 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rnqp5" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.735314 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.736148 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rvk\" (UniqueName: \"kubernetes.io/projected/0e3fd103-b88a-4289-a0f3-959c5d1de5d3-kube-api-access-j8rvk\") pod \"barbican-operator-controller-manager-6ff8b75857-g6kv2\" (UID: \"0e3fd103-b88a-4289-a0f3-959c5d1de5d3\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.736209 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x8hh\" (UniqueName: 
\"kubernetes.io/projected/b99836bf-ec49-4751-a4ab-0656ad583daf-kube-api-access-4x8hh\") pod \"cinder-operator-controller-manager-644bddb6d8-lrcwn\" (UID: \"b99836bf-ec49-4751-a4ab-0656ad583daf\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.736254 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xb8x\" (UniqueName: \"kubernetes.io/projected/9b526a9f-9938-4391-903c-6c5a861256d3-kube-api-access-7xb8x\") pod \"designate-operator-controller-manager-84f4f7b77b-4x7bb\" (UID: \"9b526a9f-9938-4391-903c-6c5a861256d3\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb" Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.741125 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.763633 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"] Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.763801 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.766656 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.768470 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-69dcm"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.801156 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.802162 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.808344 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.808689 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rxs6x"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.809399 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.812461 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.814243 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xb95h"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.827516 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.837011 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98hnh\" (UniqueName: \"kubernetes.io/projected/dc5a3b8a-b0b5-40d5-a3b8-806105406c70-kube-api-access-98hnh\") pod \"glance-operator-controller-manager-84958c4d49-pzbbf\" (UID: \"dc5a3b8a-b0b5-40d5-a3b8-806105406c70\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.837060 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mswc\" (UniqueName: \"kubernetes.io/projected/8d0a9431-6171-433c-a100-8d36b93e3422-kube-api-access-6mswc\") pod \"heat-operator-controller-manager-5d889d78cf-62mgn\" (UID: \"8d0a9431-6171-433c-a100-8d36b93e3422\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.837085 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rvk\" (UniqueName: \"kubernetes.io/projected/0e3fd103-b88a-4289-a0f3-959c5d1de5d3-kube-api-access-j8rvk\") pod \"barbican-operator-controller-manager-6ff8b75857-g6kv2\" (UID: \"0e3fd103-b88a-4289-a0f3-959c5d1de5d3\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.837125 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt94c\" (UniqueName: \"kubernetes.io/projected/ad81d435-bedc-463a-a8c6-346ab3b5ee5a-kube-api-access-kt94c\") pod \"horizon-operator-controller-manager-9f4696d94-x5qln\" (UID: \"ad81d435-bedc-463a-a8c6-346ab3b5ee5a\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.837152 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x8hh\" (UniqueName: \"kubernetes.io/projected/b99836bf-ec49-4751-a4ab-0656ad583daf-kube-api-access-4x8hh\") pod \"cinder-operator-controller-manager-644bddb6d8-lrcwn\" (UID: \"b99836bf-ec49-4751-a4ab-0656ad583daf\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.837201 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xb8x\" (UniqueName: \"kubernetes.io/projected/9b526a9f-9938-4391-903c-6c5a861256d3-kube-api-access-7xb8x\") pod \"designate-operator-controller-manager-84f4f7b77b-4x7bb\" (UID: \"9b526a9f-9938-4391-903c-6c5a861256d3\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.840340 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.841485 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.844588 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-l5fnw"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.849463 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.867201 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x8hh\" (UniqueName: \"kubernetes.io/projected/b99836bf-ec49-4751-a4ab-0656ad583daf-kube-api-access-4x8hh\") pod \"cinder-operator-controller-manager-644bddb6d8-lrcwn\" (UID: \"b99836bf-ec49-4751-a4ab-0656ad583daf\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.867423 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xb8x\" (UniqueName: \"kubernetes.io/projected/9b526a9f-9938-4391-903c-6c5a861256d3-kube-api-access-7xb8x\") pod \"designate-operator-controller-manager-84f4f7b77b-4x7bb\" (UID: \"9b526a9f-9938-4391-903c-6c5a861256d3\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.867716 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rvk\" (UniqueName: \"kubernetes.io/projected/0e3fd103-b88a-4289-a0f3-959c5d1de5d3-kube-api-access-j8rvk\") pod \"barbican-operator-controller-manager-6ff8b75857-g6kv2\" (UID: \"0e3fd103-b88a-4289-a0f3-959c5d1de5d3\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.885299 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.892734 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.892861 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.896235 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-f9wz8"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.903146 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.904122 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.913000 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-9gf8c"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.919196 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.922284 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.923306 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.925411 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-lrqbd"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.927794 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.937831 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhdx\" (UniqueName: \"kubernetes.io/projected/da47eadd-a352-46c7-84aa-fbc25bfac106-kube-api-access-dwhdx\") pod \"manila-operator-controller-manager-6d68dbc695-t4kgw\" (UID: \"da47eadd-a352-46c7-84aa-fbc25bfac106\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.937874 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpn8w\" (UniqueName: \"kubernetes.io/projected/48f86fea-23fe-488b-8429-7e97c676675b-kube-api-access-mpn8w\") pod \"ironic-operator-controller-manager-7975b88857-np9sf\" (UID: \"48f86fea-23fe-488b-8429-7e97c676675b\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.937937 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98hnh\" (UniqueName: \"kubernetes.io/projected/dc5a3b8a-b0b5-40d5-a3b8-806105406c70-kube-api-access-98hnh\") pod \"glance-operator-controller-manager-84958c4d49-pzbbf\" (UID: \"dc5a3b8a-b0b5-40d5-a3b8-806105406c70\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.937957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mswc\" (UniqueName: \"kubernetes.io/projected/8d0a9431-6171-433c-a100-8d36b93e3422-kube-api-access-6mswc\") pod \"heat-operator-controller-manager-5d889d78cf-62mgn\" (UID: \"8d0a9431-6171-433c-a100-8d36b93e3422\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.937983 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvd62\" (UniqueName: \"kubernetes.io/projected/eeb2c9bd-6467-4105-a4b9-800c69b815d0-kube-api-access-rvd62\") pod \"infra-operator-controller-manager-7d857cc749-sfgn2\" (UID: \"eeb2c9bd-6467-4105-a4b9-800c69b815d0\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.938006 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8czv\" (UniqueName: \"kubernetes.io/projected/f925f84a-6e3d-45be-82cf-320f7654adb0-kube-api-access-g8czv\") pod \"keystone-operator-controller-manager-5bd55b4bff-9j6gj\" (UID: \"f925f84a-6e3d-45be-82cf-320f7654adb0\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.938025 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt94c\" (UniqueName: \"kubernetes.io/projected/ad81d435-bedc-463a-a8c6-346ab3b5ee5a-kube-api-access-kt94c\") pod \"horizon-operator-controller-manager-9f4696d94-x5qln\" (UID: \"ad81d435-bedc-463a-a8c6-346ab3b5ee5a\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.938043 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb2c9bd-6467-4105-a4b9-800c69b815d0-cert\") pod \"infra-operator-controller-manager-7d857cc749-sfgn2\" (UID: \"eeb2c9bd-6467-4105-a4b9-800c69b815d0\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.938500 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.948695 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.958887 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.964187 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt94c\" (UniqueName: \"kubernetes.io/projected/ad81d435-bedc-463a-a8c6-346ab3b5ee5a-kube-api-access-kt94c\") pod \"horizon-operator-controller-manager-9f4696d94-x5qln\" (UID: \"ad81d435-bedc-463a-a8c6-346ab3b5ee5a\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.964681 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mswc\" (UniqueName: \"kubernetes.io/projected/8d0a9431-6171-433c-a100-8d36b93e3422-kube-api-access-6mswc\") pod \"heat-operator-controller-manager-5d889d78cf-62mgn\" (UID: \"8d0a9431-6171-433c-a100-8d36b93e3422\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.966532 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98hnh\" (UniqueName: \"kubernetes.io/projected/dc5a3b8a-b0b5-40d5-a3b8-806105406c70-kube-api-access-98hnh\") pod \"glance-operator-controller-manager-84958c4d49-pzbbf\" (UID: \"dc5a3b8a-b0b5-40d5-a3b8-806105406c70\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.979984 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.982765 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.983984 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.986263 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-g9mk8"
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.993509 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq"]
Sep 30 19:02:20 crc kubenswrapper[4747]: I0930 19:02:20.998597 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:20.999825 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.002049 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pw9x4"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.002250 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.006634 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.007774 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.012286 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-mwhps"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.012361 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.012584 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.013383 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.017954 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.019444 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-mg64c"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.031657 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.031964 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.041674 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvd62\" (UniqueName: \"kubernetes.io/projected/eeb2c9bd-6467-4105-a4b9-800c69b815d0-kube-api-access-rvd62\") pod \"infra-operator-controller-manager-7d857cc749-sfgn2\" (UID: \"eeb2c9bd-6467-4105-a4b9-800c69b815d0\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.041727 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmwpn\" (UniqueName: \"kubernetes.io/projected/4159cab8-0891-49ef-9293-425697a48dcd-kube-api-access-tmwpn\") pod \"mariadb-operator-controller-manager-88c7-vkxhv\" (UID: \"4159cab8-0891-49ef-9293-425697a48dcd\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.041751 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8czv\" (UniqueName: \"kubernetes.io/projected/f925f84a-6e3d-45be-82cf-320f7654adb0-kube-api-access-g8czv\") pod \"keystone-operator-controller-manager-5bd55b4bff-9j6gj\" (UID: \"f925f84a-6e3d-45be-82cf-320f7654adb0\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.041779 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb2c9bd-6467-4105-a4b9-800c69b815d0-cert\") pod \"infra-operator-controller-manager-7d857cc749-sfgn2\" (UID: \"eeb2c9bd-6467-4105-a4b9-800c69b815d0\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.041815 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhdx\" (UniqueName: \"kubernetes.io/projected/da47eadd-a352-46c7-84aa-fbc25bfac106-kube-api-access-dwhdx\") pod \"manila-operator-controller-manager-6d68dbc695-t4kgw\" (UID: \"da47eadd-a352-46c7-84aa-fbc25bfac106\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.041842 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndb78\" (UniqueName: \"kubernetes.io/projected/2ac6e9ef-b559-4a14-861a-84c9a2308e06-kube-api-access-ndb78\") pod \"neutron-operator-controller-manager-64d7b59854-xczb4\" (UID: \"2ac6e9ef-b559-4a14-861a-84c9a2308e06\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.041869 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpn8w\" (UniqueName: \"kubernetes.io/projected/48f86fea-23fe-488b-8429-7e97c676675b-kube-api-access-mpn8w\") pod \"ironic-operator-controller-manager-7975b88857-np9sf\" (UID: \"48f86fea-23fe-488b-8429-7e97c676675b\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.041912 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghwn\" (UniqueName: \"kubernetes.io/projected/f508c857-c282-4b4e-9d93-e7358f3aad9d-kube-api-access-bghwn\") pod \"nova-operator-controller-manager-c7c776c96-lxkkz\" (UID: \"f508c857-c282-4b4e-9d93-e7358f3aad9d\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.046676 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.048388 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eeb2c9bd-6467-4105-a4b9-800c69b815d0-cert\") pod \"infra-operator-controller-manager-7d857cc749-sfgn2\" (UID: \"eeb2c9bd-6467-4105-a4b9-800c69b815d0\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.058341 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.061475 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8czv\" (UniqueName: \"kubernetes.io/projected/f925f84a-6e3d-45be-82cf-320f7654adb0-kube-api-access-g8czv\") pod \"keystone-operator-controller-manager-5bd55b4bff-9j6gj\" (UID: \"f925f84a-6e3d-45be-82cf-320f7654adb0\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.063459 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvd62\" (UniqueName: \"kubernetes.io/projected/eeb2c9bd-6467-4105-a4b9-800c69b815d0-kube-api-access-rvd62\") pod \"infra-operator-controller-manager-7d857cc749-sfgn2\" (UID: \"eeb2c9bd-6467-4105-a4b9-800c69b815d0\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.068976 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.069074 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.070471 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhdx\" (UniqueName: \"kubernetes.io/projected/da47eadd-a352-46c7-84aa-fbc25bfac106-kube-api-access-dwhdx\") pod \"manila-operator-controller-manager-6d68dbc695-t4kgw\" (UID: \"da47eadd-a352-46c7-84aa-fbc25bfac106\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.070704 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8k5q7"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.071440 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpn8w\" (UniqueName: \"kubernetes.io/projected/48f86fea-23fe-488b-8429-7e97c676675b-kube-api-access-mpn8w\") pod \"ironic-operator-controller-manager-7975b88857-np9sf\" (UID: \"48f86fea-23fe-488b-8429-7e97c676675b\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.085112 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.095865 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.096048 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.135338 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.144390 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8lng\" (UniqueName: \"kubernetes.io/projected/b521c8df-d702-482e-b6cc-d04577dc0936-kube-api-access-g8lng\") pod \"ovn-operator-controller-manager-9976ff44c-xnttg\" (UID: \"b521c8df-d702-482e-b6cc-d04577dc0936\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.144462 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndb78\" (UniqueName: \"kubernetes.io/projected/2ac6e9ef-b559-4a14-861a-84c9a2308e06-kube-api-access-ndb78\") pod \"neutron-operator-controller-manager-64d7b59854-xczb4\" (UID: \"2ac6e9ef-b559-4a14-861a-84c9a2308e06\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.145554 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-qtkv7"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.145626 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.148018 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.148271 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb445\" (UniqueName: \"kubernetes.io/projected/d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a-kube-api-access-gb445\") pod \"placement-operator-controller-manager-589c58c6c-jm22c\" (UID: \"d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.148309 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq\" (UID: \"72272c30-dd3c-4779-bea8-e65f9e5ecd06\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.148341 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bghwn\" (UniqueName: \"kubernetes.io/projected/f508c857-c282-4b4e-9d93-e7358f3aad9d-kube-api-access-bghwn\") pod \"nova-operator-controller-manager-c7c776c96-lxkkz\" (UID: \"f508c857-c282-4b4e-9d93-e7358f3aad9d\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.148374 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gdl\" (UniqueName: \"kubernetes.io/projected/72272c30-dd3c-4779-bea8-e65f9e5ecd06-kube-api-access-b5gdl\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq\" (UID: \"72272c30-dd3c-4779-bea8-e65f9e5ecd06\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.148419 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmwpn\" (UniqueName: \"kubernetes.io/projected/4159cab8-0891-49ef-9293-425697a48dcd-kube-api-access-tmwpn\") pod \"mariadb-operator-controller-manager-88c7-vkxhv\" (UID: \"4159cab8-0891-49ef-9293-425697a48dcd\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.150647 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nfcx\" (UniqueName: \"kubernetes.io/projected/f9931c40-80f2-4467-a708-0b2fa69e05e0-kube-api-access-9nfcx\") pod \"octavia-operator-controller-manager-76fcc6dc7c-jq9pq\" (UID: \"f9931c40-80f2-4467-a708-0b2fa69e05e0\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.176228 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmwpn\" (UniqueName: \"kubernetes.io/projected/4159cab8-0891-49ef-9293-425697a48dcd-kube-api-access-tmwpn\") pod \"mariadb-operator-controller-manager-88c7-vkxhv\" (UID: \"4159cab8-0891-49ef-9293-425697a48dcd\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.190484 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.191750 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.199295 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.200972 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndb78\" (UniqueName: \"kubernetes.io/projected/2ac6e9ef-b559-4a14-861a-84c9a2308e06-kube-api-access-ndb78\") pod \"neutron-operator-controller-manager-64d7b59854-xczb4\" (UID: \"2ac6e9ef-b559-4a14-861a-84c9a2308e06\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.202104 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bghwn\" (UniqueName: \"kubernetes.io/projected/f508c857-c282-4b4e-9d93-e7358f3aad9d-kube-api-access-bghwn\") pod \"nova-operator-controller-manager-c7c776c96-lxkkz\" (UID: \"f508c857-c282-4b4e-9d93-e7358f3aad9d\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.203006 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mtn4r"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.204897 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.227778 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.229189 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.230424 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.236080 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fv5fm"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.237256 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.238168 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh"]
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.252960 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqvwv\" (UniqueName: \"kubernetes.io/projected/ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92-kube-api-access-pqvwv\") pod \"swift-operator-controller-manager-bc7dc7bd9-mqc9x\" (UID: \"ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.253135 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb445\" (UniqueName: \"kubernetes.io/projected/d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a-kube-api-access-gb445\") pod \"placement-operator-controller-manager-589c58c6c-jm22c\" (UID: \"d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.253206 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq\" (UID: \"72272c30-dd3c-4779-bea8-e65f9e5ecd06\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.253528 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gdl\" (UniqueName: \"kubernetes.io/projected/72272c30-dd3c-4779-bea8-e65f9e5ecd06-kube-api-access-b5gdl\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq\" (UID: \"72272c30-dd3c-4779-bea8-e65f9e5ecd06\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.253588 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzww7\" (UniqueName: \"kubernetes.io/projected/9942a15a-9f22-465e-8c11-ce97e741f65f-kube-api-access-tzww7\") pod \"telemetry-operator-controller-manager-7bdb6cfb74-8jxt9\" (UID: \"9942a15a-9f22-465e-8c11-ce97e741f65f\") " pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.253676 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nfcx\" (UniqueName: \"kubernetes.io/projected/f9931c40-80f2-4467-a708-0b2fa69e05e0-kube-api-access-9nfcx\") pod \"octavia-operator-controller-manager-76fcc6dc7c-jq9pq\" (UID: \"f9931c40-80f2-4467-a708-0b2fa69e05e0\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq"
Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.253704 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8lng\" (UniqueName: \"kubernetes.io/projected/b521c8df-d702-482e-b6cc-d04577dc0936-kube-api-access-g8lng\") pod \"ovn-operator-controller-manager-9976ff44c-xnttg\" (UID: \"b521c8df-d702-482e-b6cc-d04577dc0936\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg"
Sep 30 19:02:21 crc kubenswrapper[4747]: E0930 19:02:21.256076 4747 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Sep 30 19:02:21 crc kubenswrapper[4747]: E0930 19:02:21.256558 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert podName:72272c30-dd3c-4779-bea8-e65f9e5ecd06 nodeName:}" failed. No retries permitted until 2025-09-30 19:02:21.756541978 +0000 UTC m=+981.416022092 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert") pod "openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" (UID: "72272c30-dd3c-4779-bea8-e65f9e5ecd06") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.281830 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb445\" (UniqueName: \"kubernetes.io/projected/d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a-kube-api-access-gb445\") pod \"placement-operator-controller-manager-589c58c6c-jm22c\" (UID: \"d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.293547 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nfcx\" (UniqueName: \"kubernetes.io/projected/f9931c40-80f2-4467-a708-0b2fa69e05e0-kube-api-access-9nfcx\") pod \"octavia-operator-controller-manager-76fcc6dc7c-jq9pq\" (UID: \"f9931c40-80f2-4467-a708-0b2fa69e05e0\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.303602 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94"] Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.305074 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.313371 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7xwtj" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.313663 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8lng\" (UniqueName: \"kubernetes.io/projected/b521c8df-d702-482e-b6cc-d04577dc0936-kube-api-access-g8lng\") pod \"ovn-operator-controller-manager-9976ff44c-xnttg\" (UID: \"b521c8df-d702-482e-b6cc-d04577dc0936\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.313667 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.314441 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gdl\" (UniqueName: \"kubernetes.io/projected/72272c30-dd3c-4779-bea8-e65f9e5ecd06-kube-api-access-b5gdl\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq\" (UID: \"72272c30-dd3c-4779-bea8-e65f9e5ecd06\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.320306 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.333853 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94"] Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.357604 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzww7\" (UniqueName: \"kubernetes.io/projected/9942a15a-9f22-465e-8c11-ce97e741f65f-kube-api-access-tzww7\") pod \"telemetry-operator-controller-manager-7bdb6cfb74-8jxt9\" (UID: \"9942a15a-9f22-465e-8c11-ce97e741f65f\") " pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.357667 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdgbg\" (UniqueName: \"kubernetes.io/projected/f53a4663-01e9-42d3-8dc5-f708734dfe6c-kube-api-access-zdgbg\") pod \"test-operator-controller-manager-f66b554c6-pcbjx\" (UID: \"f53a4663-01e9-42d3-8dc5-f708734dfe6c\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.357699 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqvwv\" (UniqueName: \"kubernetes.io/projected/ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92-kube-api-access-pqvwv\") pod \"swift-operator-controller-manager-bc7dc7bd9-mqc9x\" (UID: \"ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.357776 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmfn6\" (UniqueName: \"kubernetes.io/projected/a60ae9a4-d39a-466a-9b2a-6122270c4614-kube-api-access-pmfn6\") pod 
\"watcher-operator-controller-manager-76669f99c-2rskh\" (UID: \"a60ae9a4-d39a-466a-9b2a-6122270c4614\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.359377 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.393119 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p"] Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.394462 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.394994 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.397718 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-99p7k" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.403746 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzww7\" (UniqueName: \"kubernetes.io/projected/9942a15a-9f22-465e-8c11-ce97e741f65f-kube-api-access-tzww7\") pod \"telemetry-operator-controller-manager-7bdb6cfb74-8jxt9\" (UID: \"9942a15a-9f22-465e-8c11-ce97e741f65f\") " pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.403891 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqvwv\" (UniqueName: \"kubernetes.io/projected/ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92-kube-api-access-pqvwv\") pod 
\"swift-operator-controller-manager-bc7dc7bd9-mqc9x\" (UID: \"ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.410481 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.417044 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p"] Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.422897 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.459081 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdgbg\" (UniqueName: \"kubernetes.io/projected/f53a4663-01e9-42d3-8dc5-f708734dfe6c-kube-api-access-zdgbg\") pod \"test-operator-controller-manager-f66b554c6-pcbjx\" (UID: \"f53a4663-01e9-42d3-8dc5-f708734dfe6c\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.459184 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbfqh\" (UniqueName: \"kubernetes.io/projected/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-kube-api-access-jbfqh\") pod \"openstack-operator-controller-manager-5468b64689-vzw94\" (UID: \"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.459237 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmfn6\" (UniqueName: 
\"kubernetes.io/projected/a60ae9a4-d39a-466a-9b2a-6122270c4614-kube-api-access-pmfn6\") pod \"watcher-operator-controller-manager-76669f99c-2rskh\" (UID: \"a60ae9a4-d39a-466a-9b2a-6122270c4614\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.459276 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert\") pod \"openstack-operator-controller-manager-5468b64689-vzw94\" (UID: \"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.470957 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.494382 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdgbg\" (UniqueName: \"kubernetes.io/projected/f53a4663-01e9-42d3-8dc5-f708734dfe6c-kube-api-access-zdgbg\") pod \"test-operator-controller-manager-f66b554c6-pcbjx\" (UID: \"f53a4663-01e9-42d3-8dc5-f708734dfe6c\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.497794 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmfn6\" (UniqueName: \"kubernetes.io/projected/a60ae9a4-d39a-466a-9b2a-6122270c4614-kube-api-access-pmfn6\") pod \"watcher-operator-controller-manager-76669f99c-2rskh\" (UID: \"a60ae9a4-d39a-466a-9b2a-6122270c4614\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.537831 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.563720 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert\") pod \"openstack-operator-controller-manager-5468b64689-vzw94\" (UID: \"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.563773 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn5cf\" (UniqueName: \"kubernetes.io/projected/6a934ea5-7f7f-4274-aac9-135961df32a6-kube-api-access-mn5cf\") pod \"rabbitmq-cluster-operator-manager-79d8469568-f528p\" (UID: \"6a934ea5-7f7f-4274-aac9-135961df32a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.563846 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbfqh\" (UniqueName: \"kubernetes.io/projected/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-kube-api-access-jbfqh\") pod \"openstack-operator-controller-manager-5468b64689-vzw94\" (UID: \"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:21 crc kubenswrapper[4747]: E0930 19:02:21.564290 4747 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 19:02:21 crc kubenswrapper[4747]: E0930 19:02:21.564339 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert podName:eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0 nodeName:}" failed. 
No retries permitted until 2025-09-30 19:02:22.064321926 +0000 UTC m=+981.723802040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert") pod "openstack-operator-controller-manager-5468b64689-vzw94" (UID: "eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0") : secret "webhook-server-cert" not found Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.606019 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbfqh\" (UniqueName: \"kubernetes.io/projected/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-kube-api-access-jbfqh\") pod \"openstack-operator-controller-manager-5468b64689-vzw94\" (UID: \"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.665529 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn5cf\" (UniqueName: \"kubernetes.io/projected/6a934ea5-7f7f-4274-aac9-135961df32a6-kube-api-access-mn5cf\") pod \"rabbitmq-cluster-operator-manager-79d8469568-f528p\" (UID: \"6a934ea5-7f7f-4274-aac9-135961df32a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.688554 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn5cf\" (UniqueName: \"kubernetes.io/projected/6a934ea5-7f7f-4274-aac9-135961df32a6-kube-api-access-mn5cf\") pod \"rabbitmq-cluster-operator-manager-79d8469568-f528p\" (UID: \"6a934ea5-7f7f-4274-aac9-135961df32a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.741163 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2"] Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 
19:02:21.767395 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq\" (UID: \"72272c30-dd3c-4779-bea8-e65f9e5ecd06\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" Sep 30 19:02:21 crc kubenswrapper[4747]: E0930 19:02:21.767749 4747 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 19:02:21 crc kubenswrapper[4747]: E0930 19:02:21.767802 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert podName:72272c30-dd3c-4779-bea8-e65f9e5ecd06 nodeName:}" failed. No retries permitted until 2025-09-30 19:02:22.767786189 +0000 UTC m=+982.427266303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert") pod "openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" (UID: "72272c30-dd3c-4779-bea8-e65f9e5ecd06") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.787702 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.925113 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" Sep 30 19:02:21 crc kubenswrapper[4747]: I0930 19:02:21.987496 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.075754 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert\") pod \"openstack-operator-controller-manager-5468b64689-vzw94\" (UID: \"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:22 crc kubenswrapper[4747]: E0930 19:02:22.075936 4747 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 19:02:22 crc kubenswrapper[4747]: E0930 19:02:22.075985 4747 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert podName:eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0 nodeName:}" failed. No retries permitted until 2025-09-30 19:02:23.075970909 +0000 UTC m=+982.735451023 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert") pod "openstack-operator-controller-manager-5468b64689-vzw94" (UID: "eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0") : secret "webhook-server-cert" not found Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.176638 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2" event={"ID":"0e3fd103-b88a-4289-a0f3-959c5d1de5d3","Type":"ContainerStarted","Data":"e70e6e0ad4bb867a08bad0d8cb56b71a6945bc23e5fc3446ed95f7d9ced16723"} Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.182543 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn" event={"ID":"b99836bf-ec49-4751-a4ab-0656ad583daf","Type":"ContainerStarted","Data":"c4021466910b6aa8366882ea2d71f03e271efe1434ad59cfb75e7518bd964b16"} Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.544386 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.547334 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.557504 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.567634 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.791426 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert\") pod 
\"openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq\" (UID: \"72272c30-dd3c-4779-bea8-e65f9e5ecd06\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.799625 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72272c30-dd3c-4779-bea8-e65f9e5ecd06-cert\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq\" (UID: \"72272c30-dd3c-4779-bea8-e65f9e5ecd06\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.843941 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.860552 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.865682 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.869690 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.877869 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.892147 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.898829 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"] Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.911776 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq"] Sep 30 19:02:22 crc kubenswrapper[4747]: W0930 19:02:22.949966 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda47eadd_a352_46c7_84aa_fbc25bfac106.slice/crio-0a8e46d55ddf17ab8a62a610cade66d52f757d5fbfcff01be07e0f5fe1bdd09a WatchSource:0}: Error finding container 0a8e46d55ddf17ab8a62a610cade66d52f757d5fbfcff01be07e0f5fe1bdd09a: Status 404 returned error can't find the container with id 0a8e46d55ddf17ab8a62a610cade66d52f757d5fbfcff01be07e0f5fe1bdd09a Sep 30 19:02:22 crc kubenswrapper[4747]: E0930 19:02:22.959317 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dwhdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6d68dbc695-t4kgw_openstack-operators(da47eadd-a352-46c7-84aa-fbc25bfac106): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.961185 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c"] Sep 30 19:02:22 crc kubenswrapper[4747]: E0930 
19:02:22.977687 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pmfn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76669f99c-2rskh_openstack-operators(a60ae9a4-d39a-466a-9b2a-6122270c4614): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:02:22 crc kubenswrapper[4747]: E0930 19:02:22.978344 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gb445,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-jm22c_openstack-operators(d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:02:22 crc kubenswrapper[4747]: I0930 19:02:22.997338 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"] Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.003450 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"] Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.012097 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8lng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-xnttg_openstack-operators(b521c8df-d702-482e-b6cc-d04577dc0936): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.013793 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9"] Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.019744 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh"] Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.024658 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx"] Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.026499 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:ac359d938872c47e1f3d7d8466b12f9d1f8a5236,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tzww7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-7bdb6cfb74-8jxt9_openstack-operators(9942a15a-9f22-465e-8c11-ce97e741f65f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.032196 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg"] Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.038679 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p"] Sep 30 19:02:23 crc kubenswrapper[4747]: W0930 19:02:23.039679 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf53a4663_01e9_42d3_8dc5_f708734dfe6c.slice/crio-19528b305abf3c2caf8875d7cac49e28e943dc9d0fee19b4f8df318ebdb89a84 WatchSource:0}: Error finding container 19528b305abf3c2caf8875d7cac49e28e943dc9d0fee19b4f8df318ebdb89a84: Status 404 returned error can't find the container with id 19528b305abf3c2caf8875d7cac49e28e943dc9d0fee19b4f8df318ebdb89a84 Sep 30 19:02:23 crc kubenswrapper[4747]: W0930 19:02:23.048119 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a934ea5_7f7f_4274_aac9_135961df32a6.slice/crio-095de5658bb7c55b70cb824beacee889c20ec75057ce86718de8e10c9836eb89 WatchSource:0}: Error finding container 095de5658bb7c55b70cb824beacee889c20ec75057ce86718de8e10c9836eb89: Status 404 returned error can't find the container with id 095de5658bb7c55b70cb824beacee889c20ec75057ce86718de8e10c9836eb89 Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.070136 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mn5cf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-f528p_openstack-operators(6a934ea5-7f7f-4274-aac9-135961df32a6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 
19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.070518 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zdgbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-f66b554c6-pcbjx_openstack-operators(f53a4663-01e9-42d3-8dc5-f708734dfe6c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.086873 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" podUID="6a934ea5-7f7f-4274-aac9-135961df32a6" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.098297 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert\") pod \"openstack-operator-controller-manager-5468b64689-vzw94\" (UID: \"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.111589 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0-cert\") pod 
\"openstack-operator-controller-manager-5468b64689-vzw94\" (UID: \"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.209333 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2" event={"ID":"eeb2c9bd-6467-4105-a4b9-800c69b815d0","Type":"ContainerStarted","Data":"16d5b2fcbea91907a9d4e6e4c46235fd5b83dcf0ba2ea98020ffa3d4934271db"} Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.210590 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" podUID="a60ae9a4-d39a-466a-9b2a-6122270c4614" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.216197 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf" event={"ID":"dc5a3b8a-b0b5-40d5-a3b8-806105406c70","Type":"ContainerStarted","Data":"69170f2b9873785eedc034a354129792545ba446eb595b17a1eba7b74417657f"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.224018 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw" event={"ID":"da47eadd-a352-46c7-84aa-fbc25bfac106","Type":"ContainerStarted","Data":"0a8e46d55ddf17ab8a62a610cade66d52f757d5fbfcff01be07e0f5fe1bdd09a"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.225897 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn" event={"ID":"8d0a9431-6171-433c-a100-8d36b93e3422","Type":"ContainerStarted","Data":"40af967856fb37ca48aea6cea0db970cbba667e5c01e7fd49042b1bd99395809"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.227860 4747 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj" event={"ID":"f925f84a-6e3d-45be-82cf-320f7654adb0","Type":"ContainerStarted","Data":"d778878c2d7c39cc28563b340bdef152b3387009657e9034bd6fe38df950df5a"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.229963 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" event={"ID":"6a934ea5-7f7f-4274-aac9-135961df32a6","Type":"ContainerStarted","Data":"095de5658bb7c55b70cb824beacee889c20ec75057ce86718de8e10c9836eb89"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.231361 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz" event={"ID":"f508c857-c282-4b4e-9d93-e7358f3aad9d","Type":"ContainerStarted","Data":"aa9fd63fa05bb28a54befef905250ff5a56fb5c4feac7d946052a845441778c0"} Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.234575 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" podUID="6a934ea5-7f7f-4274-aac9-135961df32a6" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.235355 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq" event={"ID":"f9931c40-80f2-4467-a708-0b2fa69e05e0","Type":"ContainerStarted","Data":"939332cd752890e8585a40c95ae3f91c03047363048f819969dcf325bab3668a"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.243477 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" 
event={"ID":"a60ae9a4-d39a-466a-9b2a-6122270c4614","Type":"ContainerStarted","Data":"7757542c96de7d0afd188f206a43daeaabf48b4fa6d4848f32a2f82ed7003caa"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.247991 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv" event={"ID":"4159cab8-0891-49ef-9293-425697a48dcd","Type":"ContainerStarted","Data":"6ba9bc84c923cef1ab470f9c0e7a90c982905b7464cb00537f0e8d5e67bb2b64"} Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.249968 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" podUID="a60ae9a4-d39a-466a-9b2a-6122270c4614" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.250696 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln" event={"ID":"ad81d435-bedc-463a-a8c6-346ab3b5ee5a","Type":"ContainerStarted","Data":"45de4768abd42fdcca1c9d7b800a024908f11d374a33306a9dcfc9aa7d2d37c3"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.252346 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4" event={"ID":"2ac6e9ef-b559-4a14-861a-84c9a2308e06","Type":"ContainerStarted","Data":"9290c9217ad7477510276a2c5873343ecb12a97d4b6ec804777951ae0a543d4d"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.254730 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" 
event={"ID":"9942a15a-9f22-465e-8c11-ce97e741f65f","Type":"ContainerStarted","Data":"2b68caf20d1a250aa48a941cd1afacfc31b404038f0522413ed6b517f933e27d"} Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.267731 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" podUID="d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.273574 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb" event={"ID":"9b526a9f-9938-4391-903c-6c5a861256d3","Type":"ContainerStarted","Data":"366a8ea36e831d6f08f4bad4e76564af6d68247472364c62bb5f5dc0b86a5eb8"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.278427 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x" event={"ID":"ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92","Type":"ContainerStarted","Data":"15929a16c23ba3a3e7d123dbb3e2a160cab6bd764c20287f81f9282ff93eb7d2"} Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.279609 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf" event={"ID":"48f86fea-23fe-488b-8429-7e97c676675b","Type":"ContainerStarted","Data":"abd0c856302a65cc1c7c0e94c275430900f87447fffcc5d4166e530a3a5c775a"} Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.280548 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw" podUID="da47eadd-a352-46c7-84aa-fbc25bfac106" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.283829 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" event={"ID":"f53a4663-01e9-42d3-8dc5-f708734dfe6c","Type":"ContainerStarted","Data":"19528b305abf3c2caf8875d7cac49e28e943dc9d0fee19b4f8df318ebdb89a84"} Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.283879 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" podUID="9942a15a-9f22-465e-8c11-ce97e741f65f" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.286715 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" event={"ID":"d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a","Type":"ContainerStarted","Data":"41b0cb8818f940876a03ab2e3bea8e76402c6213902c59bb5b5c509d35fe309f"} Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.288367 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" podUID="d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.294569 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" event={"ID":"b521c8df-d702-482e-b6cc-d04577dc0936","Type":"ContainerStarted","Data":"553f2a3c7e69edf42cf6c493fff946a427a36a24e1b70b2ced805960e21ef6ec"} Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.330069 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" podUID="f53a4663-01e9-42d3-8dc5-f708734dfe6c" Sep 30 19:02:23 crc kubenswrapper[4747]: E0930 19:02:23.336793 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" podUID="b521c8df-d702-482e-b6cc-d04577dc0936" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.380654 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"] Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.397658 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" Sep 30 19:02:23 crc kubenswrapper[4747]: I0930 19:02:23.861902 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94"] Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.379888 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw" event={"ID":"da47eadd-a352-46c7-84aa-fbc25bfac106","Type":"ContainerStarted","Data":"cca248897936b4b9d1bb21ffec2f2d2284a26a47b8c1d1faf8235e03211a8ad7"} Sep 30 19:02:24 crc kubenswrapper[4747]: E0930 19:02:24.383379 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw" podUID="da47eadd-a352-46c7-84aa-fbc25bfac106" Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.386950 4747 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" event={"ID":"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0","Type":"ContainerStarted","Data":"e367dd4e5d1fb29734db09ab8b4f7fadbbf6ab198e64299978d60620e166dea0"} Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.387000 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" event={"ID":"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0","Type":"ContainerStarted","Data":"55c1751e10a16bfd97be2c496183f03016952789477712843fc9cfcfed3b0a97"} Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.391049 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" event={"ID":"72272c30-dd3c-4779-bea8-e65f9e5ecd06","Type":"ContainerStarted","Data":"4d2ff866d5025d82028dab7fb454ac484cbce5aea861b84bb4edf80089944329"} Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.406225 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" event={"ID":"9942a15a-9f22-465e-8c11-ce97e741f65f","Type":"ContainerStarted","Data":"3bb2f857b58bb2a312f414bf16890de3ce6980810ef8406467226a42043f4d12"} Sep 30 19:02:24 crc kubenswrapper[4747]: E0930 19:02:24.410261 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:ac359d938872c47e1f3d7d8466b12f9d1f8a5236\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" podUID="9942a15a-9f22-465e-8c11-ce97e741f65f" Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.412887 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" 
event={"ID":"f53a4663-01e9-42d3-8dc5-f708734dfe6c","Type":"ContainerStarted","Data":"3420bb169e2f5fc9f0e78c1b138d5b91b1b969ff6acd0906b34d5708b357f40a"} Sep 30 19:02:24 crc kubenswrapper[4747]: E0930 19:02:24.416425 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" podUID="f53a4663-01e9-42d3-8dc5-f708734dfe6c" Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.431688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" event={"ID":"d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a","Type":"ContainerStarted","Data":"57bf7c3d2f4d4d0077cd970a749e715e345e7615274713d9dd77da2026d94d3d"} Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.434415 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" event={"ID":"b521c8df-d702-482e-b6cc-d04577dc0936","Type":"ContainerStarted","Data":"afbaad92fbdabd93cc0d44abc85d58dee166383b37b692c049d82d3fa1929a3b"} Sep 30 19:02:24 crc kubenswrapper[4747]: E0930 19:02:24.439873 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" podUID="b521c8df-d702-482e-b6cc-d04577dc0936" Sep 30 19:02:24 crc kubenswrapper[4747]: E0930 19:02:24.452498 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" podUID="d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a"
Sep 30 19:02:24 crc kubenswrapper[4747]: I0930 19:02:24.456272 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" event={"ID":"a60ae9a4-d39a-466a-9b2a-6122270c4614","Type":"ContainerStarted","Data":"54c58638870737551cfa5d3aa905a31c311b771e79a2ad4b2fee5dea007175bf"}
Sep 30 19:02:24 crc kubenswrapper[4747]: E0930 19:02:24.457384 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" podUID="6a934ea5-7f7f-4274-aac9-135961df32a6"
Sep 30 19:02:24 crc kubenswrapper[4747]: E0930 19:02:24.459782 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" podUID="a60ae9a4-d39a-466a-9b2a-6122270c4614"
Sep 30 19:02:25 crc kubenswrapper[4747]: E0930 19:02:25.468070 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" podUID="f53a4663-01e9-42d3-8dc5-f708734dfe6c"
Sep 30 19:02:25 crc kubenswrapper[4747]: E0930 19:02:25.468154 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" podUID="a60ae9a4-d39a-466a-9b2a-6122270c4614"
Sep 30 19:02:25 crc kubenswrapper[4747]: E0930 19:02:25.468243 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:ac359d938872c47e1f3d7d8466b12f9d1f8a5236\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" podUID="9942a15a-9f22-465e-8c11-ce97e741f65f"
Sep 30 19:02:25 crc kubenswrapper[4747]: E0930 19:02:25.468271 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw" podUID="da47eadd-a352-46c7-84aa-fbc25bfac106"
Sep 30 19:02:25 crc kubenswrapper[4747]: E0930 19:02:25.468302 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" podUID="d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a"
Sep 30 19:02:25 crc kubenswrapper[4747]: E0930 19:02:25.469070 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" podUID="b521c8df-d702-482e-b6cc-d04577dc0936"
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.557033 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2" event={"ID":"0e3fd103-b88a-4289-a0f3-959c5d1de5d3","Type":"ContainerStarted","Data":"93c22c89cb3c4d749be9d6bec977ea419494aad1b6306cab80df1b6093204409"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.563724 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz" event={"ID":"f508c857-c282-4b4e-9d93-e7358f3aad9d","Type":"ContainerStarted","Data":"ba0a88af9c9bce80690037eaa4300ae33ab1a7eb27cde7b6b55a0870b67196dd"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.570728 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf" event={"ID":"48f86fea-23fe-488b-8429-7e97c676675b","Type":"ContainerStarted","Data":"3f4af8534535b50e04c55088463263673fc7bc02cdee0f82c4bb249648d970bd"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.586441 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" event={"ID":"eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0","Type":"ContainerStarted","Data":"78f461b6f4ddb1122d1aa34d6e7a12a0b1cb4d3540dbd241f268b8739f290d8c"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.587753 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94"
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.594900 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94"
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.601394 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj" event={"ID":"f925f84a-6e3d-45be-82cf-320f7654adb0","Type":"ContainerStarted","Data":"ed750f140e2babcbb388987187e3718cc731cdc2e97efe26363c6fbea040ca4c"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.604360 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb" event={"ID":"9b526a9f-9938-4391-903c-6c5a861256d3","Type":"ContainerStarted","Data":"bf8e5740c7022df843e2e490bea690325c85ce2f273f23f956869c88e58983ef"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.607092 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv" event={"ID":"4159cab8-0891-49ef-9293-425697a48dcd","Type":"ContainerStarted","Data":"d79d1ccc4c126bb87f84557bed3e7e71ec4b6fb7aa80a4c5b33c45291af872da"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.607884 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln" event={"ID":"ad81d435-bedc-463a-a8c6-346ab3b5ee5a","Type":"ContainerStarted","Data":"8a220f7038af8d27d0c119dedeff44dd809408593734312becef315e19410f69"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.614563 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn" event={"ID":"b99836bf-ec49-4751-a4ab-0656ad583daf","Type":"ContainerStarted","Data":"8c94dc7359249dc4b30b4d8f8bd0aa688edf077715079ff24d6f4d56de17c00e"}
Sep 30 19:02:35 crc kubenswrapper[4747]: I0930 19:02:35.631128 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-vzw94" podStartSLOduration=14.6311048 podStartE2EDuration="14.6311048s" podCreationTimestamp="2025-09-30 19:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:02:35.630344138 +0000 UTC m=+995.289824242" watchObservedRunningTime="2025-09-30 19:02:35.6311048 +0000 UTC m=+995.290584924"
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.625051 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x" event={"ID":"ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92","Type":"ContainerStarted","Data":"db924976ea3bd552d9e6ab31472bcc516b6f3607eef3832770de65e0416d3a29"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.627339 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2" event={"ID":"eeb2c9bd-6467-4105-a4b9-800c69b815d0","Type":"ContainerStarted","Data":"50c79bc3a0eed0629c1da8c427364c0486596aca18eea28fa18f947cf96d6bcb"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.630438 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" event={"ID":"72272c30-dd3c-4779-bea8-e65f9e5ecd06","Type":"ContainerStarted","Data":"764c900d77d8b6a69fe09498fb6c0976fef09659134d9c24967015327474ef35"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.632502 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn" event={"ID":"8d0a9431-6171-433c-a100-8d36b93e3422","Type":"ContainerStarted","Data":"0e6c8fd54f042ed266ae6c67b19833680c43f4fb0af76fb30fc559a7a77ea941"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.633914 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2" event={"ID":"0e3fd103-b88a-4289-a0f3-959c5d1de5d3","Type":"ContainerStarted","Data":"6e14fe2191c8177bdc08ff3c8931b608b9f15a9043263617c9e79bbc87552c99"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.639579 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn" event={"ID":"b99836bf-ec49-4751-a4ab-0656ad583daf","Type":"ContainerStarted","Data":"f28fb9e6827c73c14d71d54bdefe5542f1572f414801a4a70836a120edd79149"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.639733 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn"
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.642855 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4" event={"ID":"2ac6e9ef-b559-4a14-861a-84c9a2308e06","Type":"ContainerStarted","Data":"f70263127d591262ba360b4e2dd92871cbb9177c776eb76000ac325cabb4ce5c"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.644895 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf" event={"ID":"dc5a3b8a-b0b5-40d5-a3b8-806105406c70","Type":"ContainerStarted","Data":"4e52ceabc00b9e68e739f994c1b59e0dd400a2b15f8695c6fa5f2bbc0a2b0d4e"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.646839 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq" event={"ID":"f9931c40-80f2-4467-a708-0b2fa69e05e0","Type":"ContainerStarted","Data":"b08c1a2d57a77b8376e58f64674032fa90d2cccefc55d4fa217d709a6ba311a6"}
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.655004 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2" podStartSLOduration=3.6191549629999997 podStartE2EDuration="16.654983552s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:21.846255555 +0000 UTC m=+981.505735669" lastFinishedPulling="2025-09-30 19:02:34.882084114 +0000 UTC m=+994.541564258" observedRunningTime="2025-09-30 19:02:36.650750041 +0000 UTC m=+996.310230155" watchObservedRunningTime="2025-09-30 19:02:36.654983552 +0000 UTC m=+996.314463666"
Sep 30 19:02:36 crc kubenswrapper[4747]: I0930 19:02:36.667721 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn" podStartSLOduration=3.952521333 podStartE2EDuration="16.667705016s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.166344515 +0000 UTC m=+981.825824629" lastFinishedPulling="2025-09-30 19:02:34.881528168 +0000 UTC m=+994.541008312" observedRunningTime="2025-09-30 19:02:36.666916713 +0000 UTC m=+996.326396827" watchObservedRunningTime="2025-09-30 19:02:36.667705016 +0000 UTC m=+996.327185130"
Sep 30 19:02:37 crc kubenswrapper[4747]: I0930 19:02:37.655672 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.676387 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln" event={"ID":"ad81d435-bedc-463a-a8c6-346ab3b5ee5a","Type":"ContainerStarted","Data":"fef83bd5eff983e7785e28688491fda347ee45819cf209d38f418087cd534590"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.677283 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.678336 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf" event={"ID":"dc5a3b8a-b0b5-40d5-a3b8-806105406c70","Type":"ContainerStarted","Data":"5875196b87b57e1c5a8805a16b34c00bb9c998475d501aaaa23509f4ee6046d6"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.679530 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2" event={"ID":"eeb2c9bd-6467-4105-a4b9-800c69b815d0","Type":"ContainerStarted","Data":"979e4425c1b4c7a8e91f0b7f84ad511b89550fc6fe803dcf3bf65de3d446f792"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.679871 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.681097 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj" event={"ID":"f925f84a-6e3d-45be-82cf-320f7654adb0","Type":"ContainerStarted","Data":"c13eecc21e2148d44a8aad446ccbccdfe0a1e190a2e58b3d841b070eb52e20cb"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.681442 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.683279 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq" event={"ID":"f9931c40-80f2-4467-a708-0b2fa69e05e0","Type":"ContainerStarted","Data":"daf579eadc08c3b8c4e3c524e555f41f3f9a397a0519d3a5cf99247220e3a86e"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.683914 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.689913 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb" event={"ID":"9b526a9f-9938-4391-903c-6c5a861256d3","Type":"ContainerStarted","Data":"25799d8937f75e1ac9b38abb4251f3f8c3bbe411eeadfb4997072a5e55ff535f"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.689966 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.691318 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4" event={"ID":"2ac6e9ef-b559-4a14-861a-84c9a2308e06","Type":"ContainerStarted","Data":"138b9f095626311da44cc33524ae73691c736f9b7c7dc11f215627c0ded7c8c9"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.691594 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.694531 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" event={"ID":"72272c30-dd3c-4779-bea8-e65f9e5ecd06","Type":"ContainerStarted","Data":"cd52bad3e7fc56ae49da5c18d233b626c3b80263cc6889200ac3a918801adf98"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.694610 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.696379 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn" event={"ID":"8d0a9431-6171-433c-a100-8d36b93e3422","Type":"ContainerStarted","Data":"013accdddbcdb2140bd9fd15d6816fc5962148d39c6425e12f140457909a63b0"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.697999 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x" event={"ID":"ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92","Type":"ContainerStarted","Data":"52448316558e8850c75cae01c8e48ed9eecbea75356aae6ed1d45bc2996bbd61"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.698077 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.699793 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz" event={"ID":"f508c857-c282-4b4e-9d93-e7358f3aad9d","Type":"ContainerStarted","Data":"91db5f0e44405bea69d4a658374423353281e2269df71501757fe4a26a874582"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.699934 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.701440 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf" event={"ID":"48f86fea-23fe-488b-8429-7e97c676675b","Type":"ContainerStarted","Data":"ec50ff04e11f81f0d0e32251a48ff4f6c6ef80290d3f8cd2b379c10a10efa16b"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.701578 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.702681 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv" event={"ID":"4159cab8-0891-49ef-9293-425697a48dcd","Type":"ContainerStarted","Data":"58e096f98fe308c3b8291651ddd4f06ebdc0532e8739b950a31b019a1065f54f"}
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.703095 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.742965 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq" podStartSLOduration=8.264327127 podStartE2EDuration="19.742913632s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:23.439587352 +0000 UTC m=+983.099067456" lastFinishedPulling="2025-09-30 19:02:34.918173847 +0000 UTC m=+994.577653961" observedRunningTime="2025-09-30 19:02:39.737301832 +0000 UTC m=+999.396781946" watchObservedRunningTime="2025-09-30 19:02:39.742913632 +0000 UTC m=+999.402393746"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.746247 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln" podStartSLOduration=7.71266136 podStartE2EDuration="19.746215627s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.900648929 +0000 UTC m=+982.560129043" lastFinishedPulling="2025-09-30 19:02:34.934203196 +0000 UTC m=+994.593683310" observedRunningTime="2025-09-30 19:02:39.702848986 +0000 UTC m=+999.362329100" watchObservedRunningTime="2025-09-30 19:02:39.746215627 +0000 UTC m=+999.405695741"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.781872 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz" podStartSLOduration=7.482089151 podStartE2EDuration="19.781852957s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.582235616 +0000 UTC m=+982.241715730" lastFinishedPulling="2025-09-30 19:02:34.881999392 +0000 UTC m=+994.541479536" observedRunningTime="2025-09-30 19:02:39.779760817 +0000 UTC m=+999.439240951" watchObservedRunningTime="2025-09-30 19:02:39.781852957 +0000 UTC m=+999.441333071"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.784056 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf" podStartSLOduration=7.684974878 podStartE2EDuration="19.7840474s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.901068561 +0000 UTC m=+982.560548675" lastFinishedPulling="2025-09-30 19:02:35.000141013 +0000 UTC m=+994.659621197" observedRunningTime="2025-09-30 19:02:39.760775324 +0000 UTC m=+999.420255438" watchObservedRunningTime="2025-09-30 19:02:39.7840474 +0000 UTC m=+999.443527514"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.804108 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb" podStartSLOduration=7.49151264 podStartE2EDuration="19.804085233s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.569491231 +0000 UTC m=+982.228971345" lastFinishedPulling="2025-09-30 19:02:34.882063794 +0000 UTC m=+994.541543938" observedRunningTime="2025-09-30 19:02:39.799208563 +0000 UTC m=+999.458688687" watchObservedRunningTime="2025-09-30 19:02:39.804085233 +0000 UTC m=+999.463565347"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.818992 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj" podStartSLOduration=7.796922231 podStartE2EDuration="19.818975359s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.913473386 +0000 UTC m=+982.572953500" lastFinishedPulling="2025-09-30 19:02:34.935526514 +0000 UTC m=+994.595006628" observedRunningTime="2025-09-30 19:02:39.818742813 +0000 UTC m=+999.478222927" watchObservedRunningTime="2025-09-30 19:02:39.818975359 +0000 UTC m=+999.478455473"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.841189 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x" podStartSLOduration=7.803294163 podStartE2EDuration="19.841168794s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.901291937 +0000 UTC m=+982.560772051" lastFinishedPulling="2025-09-30 19:02:34.939166528 +0000 UTC m=+994.598646682" observedRunningTime="2025-09-30 19:02:39.833945808 +0000 UTC m=+999.493425952" watchObservedRunningTime="2025-09-30 19:02:39.841168794 +0000 UTC m=+999.500648908"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.853932 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv" podStartSLOduration=7.844373489 podStartE2EDuration="19.853899569s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.924048118 +0000 UTC m=+982.583528232" lastFinishedPulling="2025-09-30 19:02:34.933574188 +0000 UTC m=+994.593054312" observedRunningTime="2025-09-30 19:02:39.847472445 +0000 UTC m=+999.506952559" watchObservedRunningTime="2025-09-30 19:02:39.853899569 +0000 UTC m=+999.513379683"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.886696 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq" podStartSLOduration=7.839375306 podStartE2EDuration="19.886672337s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.923863073 +0000 UTC m=+982.583343187" lastFinishedPulling="2025-09-30 19:02:34.971160104 +0000 UTC m=+994.630640218" observedRunningTime="2025-09-30 19:02:39.87946505 +0000 UTC m=+999.538945164" watchObservedRunningTime="2025-09-30 19:02:39.886672337 +0000 UTC m=+999.546152451"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.902861 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2" podStartSLOduration=7.817721526 podStartE2EDuration="19.902843599s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.900672439 +0000 UTC m=+982.560152553" lastFinishedPulling="2025-09-30 19:02:34.985794472 +0000 UTC m=+994.645274626" observedRunningTime="2025-09-30 19:02:39.899830573 +0000 UTC m=+999.559310687" watchObservedRunningTime="2025-09-30 19:02:39.902843599 +0000 UTC m=+999.562323713"
Sep 30 19:02:39 crc kubenswrapper[4747]: I0930 19:02:39.930900 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4" podStartSLOduration=7.981776712 podStartE2EDuration="19.930882872s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.957324221 +0000 UTC m=+982.616804335" lastFinishedPulling="2025-09-30 19:02:34.906430381 +0000 UTC m=+994.565910495" observedRunningTime="2025-09-30 19:02:39.927536556 +0000 UTC m=+999.587016670" watchObservedRunningTime="2025-09-30 19:02:39.930882872 +0000 UTC m=+999.590362986"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.712163 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717412 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-np9sf"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717462 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-xczb4"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717483 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-x5qln"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717505 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-mqc9x"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717524 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-4x7bb"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717544 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-vkxhv"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717563 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-lxkkz"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717586 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717604 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-jq9pq"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.717624 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9j6gj"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.718886 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-sfgn2"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.721037 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.729226 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-62mgn" podStartSLOduration=8.403642873 podStartE2EDuration="20.729205557s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.55825546 +0000 UTC m=+982.217735574" lastFinishedPulling="2025-09-30 19:02:34.883818104 +0000 UTC m=+994.543298258" observedRunningTime="2025-09-30 19:02:40.727778427 +0000 UTC m=+1000.387258541" watchObservedRunningTime="2025-09-30 19:02:40.729205557 +0000 UTC m=+1000.388685671"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.922329 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-g6kv2"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.937353 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrcwn"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.946651 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf" podStartSLOduration=8.566142004 podStartE2EDuration="20.946618279s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.554440751 +0000 UTC m=+982.213920865" lastFinishedPulling="2025-09-30 19:02:34.934917016 +0000 UTC m=+994.594397140" observedRunningTime="2025-09-30 19:02:40.944601032 +0000 UTC m=+1000.604081146" watchObservedRunningTime="2025-09-30 19:02:40.946618279 +0000 UTC m=+1000.606098403"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.981108 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"
Sep 30 19:02:40 crc kubenswrapper[4747]: I0930 19:02:40.984593 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-pzbbf"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.782173 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" event={"ID":"d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a","Type":"ContainerStarted","Data":"2a242a8a8ce3f64b65c445e2d7ef18ad4a5756d85854bf87be02bb113de5f821"}
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.785492 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.788184 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" event={"ID":"b521c8df-d702-482e-b6cc-d04577dc0936","Type":"ContainerStarted","Data":"2565053d6098b54c211506d6cc29d0961c6ea52e3ee058ca7c54524dc7ce5739"}
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.788991 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.793668 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" event={"ID":"a60ae9a4-d39a-466a-9b2a-6122270c4614","Type":"ContainerStarted","Data":"1363f33e93d950bdafe1ec3c4dd73513f8e3b6a7983e205ff0e384cfff9a3547"}
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.794015 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.795513 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw" event={"ID":"da47eadd-a352-46c7-84aa-fbc25bfac106","Type":"ContainerStarted","Data":"841cb1ea2d041cc103a26565d7210839172875c5f2e0d7af3af0e348e146c640"}
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.795733 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.798322 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" event={"ID":"9942a15a-9f22-465e-8c11-ce97e741f65f","Type":"ContainerStarted","Data":"8905647e0c83dbb29b120922ab06cce8be2e8c2a2aac28ac2415ebcb8f71a8c4"}
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.799160 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.803465 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" event={"ID":"f53a4663-01e9-42d3-8dc5-f708734dfe6c","Type":"ContainerStarted","Data":"d75b5372862ac2a91cb747c4721b1427337b3fe022ffc81f8d36486439dda406"}
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.803697 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.807661 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c" podStartSLOduration=3.334490444 podStartE2EDuration="25.807644503s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.976418187 +0000 UTC m=+982.635898301" lastFinishedPulling="2025-09-30 19:02:45.449572246 +0000 UTC m=+1005.109052360" observedRunningTime="2025-09-30 19:02:45.806081998 +0000 UTC m=+1005.465562132" watchObservedRunningTime="2025-09-30 19:02:45.807644503 +0000 UTC m=+1005.467124627"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.822864 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg" podStartSLOduration=3.4590022080000002 podStartE2EDuration="25.822837388s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:23.011962184 +0000 UTC m=+982.671442298" lastFinishedPulling="2025-09-30 19:02:45.375797364 +0000 UTC m=+1005.035277478" observedRunningTime="2025-09-30 19:02:45.81975315 +0000 UTC m=+1005.479233274" watchObservedRunningTime="2025-09-30 19:02:45.822837388 +0000 UTC m=+1005.482317542"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.838874 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw" podStartSLOduration=3.42204879 podStartE2EDuration="25.838853086s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.959077831 +0000 UTC m=+982.618557945" lastFinishedPulling="2025-09-30 19:02:45.375882087 +0000 UTC m=+1005.035362241" observedRunningTime="2025-09-30 19:02:45.83620202 +0000 UTC m=+1005.495682144" watchObservedRunningTime="2025-09-30 19:02:45.838853086 +0000 UTC m=+1005.498333210"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.876878 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" podStartSLOduration=3.525926993 podStartE2EDuration="25.876853784s" podCreationTimestamp="2025-09-30 19:02:20 +0000 UTC" firstStartedPulling="2025-09-30 19:02:23.025650446 +0000 UTC m=+982.685130560" lastFinishedPulling="2025-09-30 19:02:45.376577237 +0000 UTC m=+1005.036057351" observedRunningTime="2025-09-30 19:02:45.864881931 +0000 UTC m=+1005.524362045" watchObservedRunningTime="2025-09-30 19:02:45.876853784 +0000 UTC m=+1005.536333898"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.890527 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" podStartSLOduration=2.583954694 podStartE2EDuration="24.890504114s" podCreationTimestamp="2025-09-30 19:02:21 +0000 UTC" firstStartedPulling="2025-09-30 19:02:23.069918553 +0000 UTC m=+982.729398667" lastFinishedPulling="2025-09-30 19:02:45.376467943 +0000 UTC m=+1005.035948087" observedRunningTime="2025-09-30 19:02:45.881708883 +0000 UTC m=+1005.541188997" watchObservedRunningTime="2025-09-30 19:02:45.890504114 +0000 UTC m=+1005.549984228"
Sep 30 19:02:45 crc kubenswrapper[4747]: I0930 19:02:45.923277 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" podStartSLOduration=2.493138445 podStartE2EDuration="24.923261582s" podCreationTimestamp="2025-09-30 19:02:21 +0000 UTC" firstStartedPulling="2025-09-30 19:02:22.976337455 +0000 UTC m=+982.635817559" lastFinishedPulling="2025-09-30 19:02:45.406460552 +0000 UTC m=+1005.065940696" observedRunningTime="2025-09-30 19:02:45.921199603 +0000 UTC m=+1005.580679717" watchObservedRunningTime="2025-09-30 19:02:45.923261582 +0000 UTC m=+1005.582741696"
Sep 30 19:02:46 crc kubenswrapper[4747]: I0930 19:02:46.814173 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" event={"ID":"6a934ea5-7f7f-4274-aac9-135961df32a6","Type":"ContainerStarted","Data":"8da1f37d07db25461893ce5d66298df66b6ba5f7c3016544b73c753caad1fb7b"}
Sep 30 19:02:46 crc kubenswrapper[4747]: I0930 19:02:46.838126 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-f528p" podStartSLOduration=3.502061349 podStartE2EDuration="25.838097203s" podCreationTimestamp="2025-09-30 19:02:21 +0000 UTC" firstStartedPulling="2025-09-30 19:02:23.069990305 +0000 UTC m=+982.729470419" lastFinishedPulling="2025-09-30 19:02:45.406026109 +0000 UTC m=+1005.065506273" observedRunningTime="2025-09-30 19:02:46.836809576 +0000 UTC m=+1006.496289750" watchObservedRunningTime="2025-09-30 19:02:46.838097203 +0000 UTC m=+1006.497577327"
Sep 30 19:02:51 crc kubenswrapper[4747]: I0930 19:02:51.210077 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-t4kgw"
Sep 30 19:02:51 crc kubenswrapper[4747]: I0930 19:02:51.399765 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-xnttg"
Sep 30 19:02:51 crc kubenswrapper[4747]: I0930 19:02:51.415091 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jm22c"
Sep 30 19:02:51 crc
kubenswrapper[4747]: I0930 19:02:51.474743 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8jxt9" Sep 30 19:02:51 crc kubenswrapper[4747]: I0930 19:02:51.542813 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-2rskh" Sep 30 19:02:51 crc kubenswrapper[4747]: I0930 19:02:51.791835 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-pcbjx" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.100083 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6jj7d"] Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.102919 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.105686 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xcvc2" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.106490 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.106526 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.106914 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.120870 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6jj7d"] Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.164282 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/57ad5ca4-3275-4ab3-9818-c2afce879d04-config\") pod \"dnsmasq-dns-675f4bcbfc-6jj7d\" (UID: \"57ad5ca4-3275-4ab3-9818-c2afce879d04\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.164480 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njpjz\" (UniqueName: \"kubernetes.io/projected/57ad5ca4-3275-4ab3-9818-c2afce879d04-kube-api-access-njpjz\") pod \"dnsmasq-dns-675f4bcbfc-6jj7d\" (UID: \"57ad5ca4-3275-4ab3-9818-c2afce879d04\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.166032 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j7gc9"] Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.167594 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.173250 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.179048 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j7gc9"] Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.265266 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5ckf\" (UniqueName: \"kubernetes.io/projected/337c81e9-fe53-4437-8a46-3d10f4d195e1-kube-api-access-v5ckf\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.265318 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njpjz\" (UniqueName: \"kubernetes.io/projected/57ad5ca4-3275-4ab3-9818-c2afce879d04-kube-api-access-njpjz\") pod 
\"dnsmasq-dns-675f4bcbfc-6jj7d\" (UID: \"57ad5ca4-3275-4ab3-9818-c2afce879d04\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.265367 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ad5ca4-3275-4ab3-9818-c2afce879d04-config\") pod \"dnsmasq-dns-675f4bcbfc-6jj7d\" (UID: \"57ad5ca4-3275-4ab3-9818-c2afce879d04\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.265433 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-config\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.265481 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.266541 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ad5ca4-3275-4ab3-9818-c2afce879d04-config\") pod \"dnsmasq-dns-675f4bcbfc-6jj7d\" (UID: \"57ad5ca4-3275-4ab3-9818-c2afce879d04\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.297776 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njpjz\" (UniqueName: \"kubernetes.io/projected/57ad5ca4-3275-4ab3-9818-c2afce879d04-kube-api-access-njpjz\") pod \"dnsmasq-dns-675f4bcbfc-6jj7d\" (UID: \"57ad5ca4-3275-4ab3-9818-c2afce879d04\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.366567 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5ckf\" (UniqueName: \"kubernetes.io/projected/337c81e9-fe53-4437-8a46-3d10f4d195e1-kube-api-access-v5ckf\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.366662 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-config\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.366687 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.367704 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.367824 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-config\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.392639 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5ckf\" (UniqueName: \"kubernetes.io/projected/337c81e9-fe53-4437-8a46-3d10f4d195e1-kube-api-access-v5ckf\") pod \"dnsmasq-dns-78dd6ddcc-j7gc9\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.424221 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.486857 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.649842 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6jj7d"] Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.664110 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:03:08 crc kubenswrapper[4747]: I0930 19:03:08.709619 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j7gc9"] Sep 30 19:03:08 crc kubenswrapper[4747]: W0930 19:03:08.717385 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod337c81e9_fe53_4437_8a46_3d10f4d195e1.slice/crio-79da587b09841fed0f812202b56e255b30abd30b0e200e166ab4c777eb4f39d6 WatchSource:0}: Error finding container 79da587b09841fed0f812202b56e255b30abd30b0e200e166ab4c777eb4f39d6: Status 404 returned error can't find the container with id 79da587b09841fed0f812202b56e255b30abd30b0e200e166ab4c777eb4f39d6 Sep 30 19:03:09 crc kubenswrapper[4747]: I0930 19:03:09.030500 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" 
event={"ID":"57ad5ca4-3275-4ab3-9818-c2afce879d04","Type":"ContainerStarted","Data":"021008d95c1965ab5a86221d156b4ede739850c0905efbedce98c361f308545d"} Sep 30 19:03:09 crc kubenswrapper[4747]: I0930 19:03:09.032824 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" event={"ID":"337c81e9-fe53-4437-8a46-3d10f4d195e1","Type":"ContainerStarted","Data":"79da587b09841fed0f812202b56e255b30abd30b0e200e166ab4c777eb4f39d6"} Sep 30 19:03:10 crc kubenswrapper[4747]: I0930 19:03:10.693232 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6jj7d"] Sep 30 19:03:10 crc kubenswrapper[4747]: I0930 19:03:10.713244 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nqz9b"] Sep 30 19:03:10 crc kubenswrapper[4747]: I0930 19:03:10.714456 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:10 crc kubenswrapper[4747]: I0930 19:03:10.725148 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nqz9b"] Sep 30 19:03:10 crc kubenswrapper[4747]: I0930 19:03:10.902563 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:10 crc kubenswrapper[4747]: I0930 19:03:10.902607 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdtf8\" (UniqueName: \"kubernetes.io/projected/7f34d2ac-ba27-42df-93dd-a7803fb7220c-kube-api-access-pdtf8\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:10 crc kubenswrapper[4747]: 
I0930 19:03:10.902644 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-config\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:10 crc kubenswrapper[4747]: I0930 19:03:10.987519 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j7gc9"] Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.003892 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.003973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdtf8\" (UniqueName: \"kubernetes.io/projected/7f34d2ac-ba27-42df-93dd-a7803fb7220c-kube-api-access-pdtf8\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.004018 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-config\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.004853 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " 
pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.005168 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-config\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.015298 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-77gkx"] Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.016522 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.027753 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-77gkx"] Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.031232 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdtf8\" (UniqueName: \"kubernetes.io/projected/7f34d2ac-ba27-42df-93dd-a7803fb7220c-kube-api-access-pdtf8\") pod \"dnsmasq-dns-666b6646f7-nqz9b\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.033565 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.107343 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-config\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.107411 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.107448 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxp4m\" (UniqueName: \"kubernetes.io/projected/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-kube-api-access-lxp4m\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.209674 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-config\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.209732 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.209761 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxp4m\" (UniqueName: \"kubernetes.io/projected/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-kube-api-access-lxp4m\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.210824 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-config\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.211389 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.229482 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxp4m\" (UniqueName: \"kubernetes.io/projected/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-kube-api-access-lxp4m\") pod \"dnsmasq-dns-57d769cc4f-77gkx\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.358306 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.512469 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nqz9b"] Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.875278 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.883087 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.885038 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dfs52" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.885457 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.887225 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.887373 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.887677 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.887801 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.888619 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 19:03:11 crc kubenswrapper[4747]: I0930 19:03:11.895681 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018623 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-server-conf\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018671 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018703 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018727 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018749 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018767 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/008f0030-d04c-427c-bc09-76874ee17b16-pod-info\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018884 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b72fk\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-kube-api-access-b72fk\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018916 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018955 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.018973 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-config-data\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.019009 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/008f0030-d04c-427c-bc09-76874ee17b16-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120387 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120451 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120469 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-config-data\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120498 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/008f0030-d04c-427c-bc09-76874ee17b16-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120559 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-server-conf\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " 
pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120578 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120605 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120625 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120642 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120661 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/008f0030-d04c-427c-bc09-76874ee17b16-pod-info\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120686 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b72fk\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-kube-api-access-b72fk\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.120992 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.121846 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.122319 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-config-data\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.122541 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.122576 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.122815 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/008f0030-d04c-427c-bc09-76874ee17b16-server-conf\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.127298 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/008f0030-d04c-427c-bc09-76874ee17b16-pod-info\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.127321 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/008f0030-d04c-427c-bc09-76874ee17b16-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.127729 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.138784 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b72fk\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-kube-api-access-b72fk\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.139461 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/008f0030-d04c-427c-bc09-76874ee17b16-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.147687 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"008f0030-d04c-427c-bc09-76874ee17b16\") " pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.174260 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.175532 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.180686 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.180969 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.181252 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jvs6r" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.181591 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.181801 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.181974 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 19:03:12 crc kubenswrapper[4747]: 
I0930 19:03:12.187565 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.188450 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.223476 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.323373 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.323460 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.323538 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.323603 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.323626 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.323663 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.323677 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.323976 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrrm\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-kube-api-access-tgrrm\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.324013 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ae06b6d-5f75-46a4-8805-0d99f3771c71-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.324033 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.324066 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ae06b6d-5f75-46a4-8805-0d99f3771c71-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.429653 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrrm\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-kube-api-access-tgrrm\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.429772 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ae06b6d-5f75-46a4-8805-0d99f3771c71-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.429806 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.429968 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ae06b6d-5f75-46a4-8805-0d99f3771c71-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.430073 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.430121 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.430171 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.430246 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.430272 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.430304 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.430326 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.431013 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.431402 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.431405 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.431556 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.431732 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.432077 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ae06b6d-5f75-46a4-8805-0d99f3771c71-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.433755 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ae06b6d-5f75-46a4-8805-0d99f3771c71-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.433758 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ae06b6d-5f75-46a4-8805-0d99f3771c71-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.448541 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.452005 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrrm\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-kube-api-access-tgrrm\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.452841 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ae06b6d-5f75-46a4-8805-0d99f3771c71-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.463793 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ae06b6d-5f75-46a4-8805-0d99f3771c71\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:12 crc kubenswrapper[4747]: I0930 19:03:12.511020 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.227504 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-77gkx"] Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.373586 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.375050 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.381823 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.381900 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-q62md" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.382069 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.382800 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.384064 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.386085 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.404876 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.471770 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-galera-tls-certs\") pod 
\"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.471818 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-config-data-default\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.471847 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-kolla-config\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.471873 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-secrets\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.471915 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.471954 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxpg4\" (UniqueName: \"kubernetes.io/projected/6ffa91d2-f90c-4b61-be02-28351b9d7d30-kube-api-access-pxpg4\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " 
pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.471972 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.471987 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.472014 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6ffa91d2-f90c-4b61-be02-28351b9d7d30-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.573623 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.573677 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxpg4\" (UniqueName: \"kubernetes.io/projected/6ffa91d2-f90c-4b61-be02-28351b9d7d30-kube-api-access-pxpg4\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 
19:03:14.573699 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.573716 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.573747 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6ffa91d2-f90c-4b61-be02-28351b9d7d30-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.573794 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.573820 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-config-data-default\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.573843 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-kolla-config\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.573866 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-secrets\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.574487 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.574975 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6ffa91d2-f90c-4b61-be02-28351b9d7d30-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.575276 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-config-data-default\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.575573 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.575944 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffa91d2-f90c-4b61-be02-28351b9d7d30-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.580705 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.582492 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.589378 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxpg4\" (UniqueName: \"kubernetes.io/projected/6ffa91d2-f90c-4b61-be02-28351b9d7d30-kube-api-access-pxpg4\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.594650 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.604029 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6ffa91d2-f90c-4b61-be02-28351b9d7d30-secrets\") pod \"openstack-galera-0\" (UID: \"6ffa91d2-f90c-4b61-be02-28351b9d7d30\") " pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.709020 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.755030 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.756271 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.758107 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.758116 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dzpmv" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.758267 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.759323 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.772588 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 19:03:14 crc kubenswrapper[4747]: W0930 19:03:14.818647 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf275f9b3_a8e7_4513_ae6c_dd1683bd6f61.slice/crio-16bbbf7af3f6f49bf3e9b321490957b1db74df22ec2a2613db32b27ba45a3680 WatchSource:0}: Error finding container 
16bbbf7af3f6f49bf3e9b321490957b1db74df22ec2a2613db32b27ba45a3680: Status 404 returned error can't find the container with id 16bbbf7af3f6f49bf3e9b321490957b1db74df22ec2a2613db32b27ba45a3680 Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.877833 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.877886 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.877906 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.877943 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.877989 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.878023 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.878090 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7vpn\" (UniqueName: \"kubernetes.io/projected/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-kube-api-access-f7vpn\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.878164 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.878190 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.979701 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980051 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980068 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980086 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980100 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980138 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980155 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980173 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7vpn\" (UniqueName: \"kubernetes.io/projected/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-kube-api-access-f7vpn\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980226 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.980371 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.981008 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.981035 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.981853 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.983255 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.986598 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.986677 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:14 crc kubenswrapper[4747]: I0930 19:03:14.989554 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.008452 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7vpn\" (UniqueName: \"kubernetes.io/projected/2b7e2ab9-2510-47d9-b1c6-3562b1c968be-kube-api-access-f7vpn\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.019921 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2b7e2ab9-2510-47d9-b1c6-3562b1c968be\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.119570 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" event={"ID":"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61","Type":"ContainerStarted","Data":"16bbbf7af3f6f49bf3e9b321490957b1db74df22ec2a2613db32b27ba45a3680"} Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.120767 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" event={"ID":"7f34d2ac-ba27-42df-93dd-a7803fb7220c","Type":"ContainerStarted","Data":"250741bc0496a40033408ded9ea95ebf7dcf77f6c77322e055413118f2fb631d"} Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.137530 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.291261 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.294674 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.298733 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-ckdxr" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.298793 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.300088 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.306843 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.388892 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.388965 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-config-data\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.388991 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.389027 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2kwr\" (UniqueName: \"kubernetes.io/projected/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-kube-api-access-t2kwr\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.389090 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-kolla-config\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.490599 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-kolla-config\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.490651 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.490685 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-config-data\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" 
Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.490704 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.490739 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2kwr\" (UniqueName: \"kubernetes.io/projected/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-kube-api-access-t2kwr\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.491430 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-kolla-config\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.491580 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-config-data\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.494835 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.507498 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.510852 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2kwr\" (UniqueName: \"kubernetes.io/projected/cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6-kube-api-access-t2kwr\") pod \"memcached-0\" (UID: \"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6\") " pod="openstack/memcached-0" Sep 30 19:03:15 crc kubenswrapper[4747]: I0930 19:03:15.611753 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.438830 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xgjfm"] Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.440593 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.444623 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nfb9r" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.444880 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.445011 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.448226 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9nf99"] Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.451357 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.460987 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xgjfm"] Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.467562 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9nf99"] Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.514874 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-etc-ovs\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.514934 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-run\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515057 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-run-ovn\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515130 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7211e36c-ce8d-434e-9952-e5f5eb7097ec-scripts\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515292 4747 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7211e36c-ce8d-434e-9952-e5f5eb7097ec-ovn-controller-tls-certs\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515352 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-run\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515397 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-lib\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515457 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e0750a-cba0-4fc3-8ff5-9ed716525dee-scripts\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515480 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-log-ovn\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515506 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7211e36c-ce8d-434e-9952-e5f5eb7097ec-combined-ca-bundle\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515623 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-log\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515651 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggf7v\" (UniqueName: \"kubernetes.io/projected/7211e36c-ce8d-434e-9952-e5f5eb7097ec-kube-api-access-ggf7v\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.515682 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx7mn\" (UniqueName: \"kubernetes.io/projected/24e0750a-cba0-4fc3-8ff5-9ed716525dee-kube-api-access-dx7mn\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.616811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-log\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.616861 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ggf7v\" (UniqueName: \"kubernetes.io/projected/7211e36c-ce8d-434e-9952-e5f5eb7097ec-kube-api-access-ggf7v\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.616884 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx7mn\" (UniqueName: \"kubernetes.io/projected/24e0750a-cba0-4fc3-8ff5-9ed716525dee-kube-api-access-dx7mn\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.616910 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-etc-ovs\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.616944 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-run\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.616977 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-run-ovn\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.616995 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7211e36c-ce8d-434e-9952-e5f5eb7097ec-scripts\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617023 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7211e36c-ce8d-434e-9952-e5f5eb7097ec-ovn-controller-tls-certs\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617043 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-run\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617062 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-lib\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617081 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e0750a-cba0-4fc3-8ff5-9ed716525dee-scripts\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617096 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-log-ovn\") pod \"ovn-controller-xgjfm\" (UID: 
\"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617110 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7211e36c-ce8d-434e-9952-e5f5eb7097ec-combined-ca-bundle\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617710 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-etc-ovs\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617812 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-log-ovn\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617792 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-run-ovn\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.617908 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7211e36c-ce8d-434e-9952-e5f5eb7097ec-var-run\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.618247 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-lib\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.619179 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-run\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.620252 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7211e36c-ce8d-434e-9952-e5f5eb7097ec-scripts\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.620342 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/24e0750a-cba0-4fc3-8ff5-9ed716525dee-var-log\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.621516 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e0750a-cba0-4fc3-8ff5-9ed716525dee-scripts\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.623115 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7211e36c-ce8d-434e-9952-e5f5eb7097ec-combined-ca-bundle\") pod \"ovn-controller-xgjfm\" (UID: 
\"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.627940 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7211e36c-ce8d-434e-9952-e5f5eb7097ec-ovn-controller-tls-certs\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.634897 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggf7v\" (UniqueName: \"kubernetes.io/projected/7211e36c-ce8d-434e-9952-e5f5eb7097ec-kube-api-access-ggf7v\") pod \"ovn-controller-xgjfm\" (UID: \"7211e36c-ce8d-434e-9952-e5f5eb7097ec\") " pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.637517 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx7mn\" (UniqueName: \"kubernetes.io/projected/24e0750a-cba0-4fc3-8ff5-9ed716525dee-kube-api-access-dx7mn\") pod \"ovn-controller-ovs-9nf99\" (UID: \"24e0750a-cba0-4fc3-8ff5-9ed716525dee\") " pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.767836 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:21 crc kubenswrapper[4747]: I0930 19:03:21.774338 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.075252 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.077252 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.079416 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.081628 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.081814 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.083369 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.083699 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.089467 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-sbkg7" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.147324 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s68ld\" (UniqueName: \"kubernetes.io/projected/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-kube-api-access-s68ld\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.147402 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-config\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.147445 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.147519 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.147634 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.147824 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.147883 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.147989 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: E0930 19:03:23.170245 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 19:03:23 crc kubenswrapper[4747]: E0930 19:03:23.170477 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5ckf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livenes
sProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-j7gc9_openstack(337c81e9-fe53-4437-8a46-3d10f4d195e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:03:23 crc kubenswrapper[4747]: E0930 19:03:23.184361 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" podUID="337c81e9-fe53-4437-8a46-3d10f4d195e1" Sep 30 19:03:23 crc kubenswrapper[4747]: E0930 19:03:23.213742 4747 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 19:03:23 crc kubenswrapper[4747]: E0930 19:03:23.214270 4747 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv 
--bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njpjz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-6jj7d_openstack(57ad5ca4-3275-4ab3-9818-c2afce879d04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:03:23 crc kubenswrapper[4747]: E0930 19:03:23.219145 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" podUID="57ad5ca4-3275-4ab3-9818-c2afce879d04" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.249773 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.249813 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.249845 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.249874 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68ld\" (UniqueName: \"kubernetes.io/projected/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-kube-api-access-s68ld\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.249895 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-config\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: 
I0930 19:03:23.249916 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.249955 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.249987 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.251064 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.251739 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.256624 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.257119 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-config\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.258896 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.259026 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.259610 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.274484 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc 
kubenswrapper[4747]: I0930 19:03:23.306345 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s68ld\" (UniqueName: \"kubernetes.io/projected/cbbbaf74-0d3f-45fb-8b60-e6edb738ebab-kube-api-access-s68ld\") pod \"ovsdbserver-nb-0\" (UID: \"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.307560 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.309037 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.314730 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fgngr" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.315129 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.317231 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.319187 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.320735 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.365225 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.365288 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.365488 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.365675 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-config\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.365729 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.365767 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jr5z\" (UniqueName: \"kubernetes.io/projected/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-kube-api-access-9jr5z\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.365796 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.365848 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.404302 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.467538 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.467653 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.467679 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.467749 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.467846 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-config\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.467849 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.470318 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-config\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.470600 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.470811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jr5z\" (UniqueName: \"kubernetes.io/projected/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-kube-api-access-9jr5z\") 
pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.470836 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.470976 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.471745 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.476747 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.478962 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.479713 4747 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.496515 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jr5z\" (UniqueName: \"kubernetes.io/projected/6c1ea28e-2148-476c-9493-8e4be1d4dfa1-kube-api-access-9jr5z\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.500018 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6c1ea28e-2148-476c-9493-8e4be1d4dfa1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.673338 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.701708 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.731680 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xgjfm"] Sep 30 19:03:23 crc kubenswrapper[4747]: W0930 19:03:23.745368 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7211e36c_ce8d_434e_9952_e5f5eb7097ec.slice/crio-6ead31b7ebe72a91ab6ab0ab43c515e0b977b93f794599f2a9a62a4f0baa7d19 WatchSource:0}: Error finding container 6ead31b7ebe72a91ab6ab0ab43c515e0b977b93f794599f2a9a62a4f0baa7d19: Status 404 returned error can't find the container with id 6ead31b7ebe72a91ab6ab0ab43c515e0b977b93f794599f2a9a62a4f0baa7d19 Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.768598 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.833656 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 19:03:23 crc kubenswrapper[4747]: W0930 19:03:23.844692 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ffa91d2_f90c_4b61_be02_28351b9d7d30.slice/crio-4aea49b2da8228310117769294c42bbcb8f3bdae7e1c510f23dbf6e49f14488f WatchSource:0}: Error finding container 4aea49b2da8228310117769294c42bbcb8f3bdae7e1c510f23dbf6e49f14488f: Status 404 returned error can't find the container with id 4aea49b2da8228310117769294c42bbcb8f3bdae7e1c510f23dbf6e49f14488f Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.896215 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 19:03:23 crc kubenswrapper[4747]: W0930 19:03:23.911510 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbe40b05_41bb_4a7a_9d8d_24d9117d8bc6.slice/crio-2f758020e5f4a2906f9a7aed89de78ff6e8a90dab9f0d3bb864313dfbf94da97 WatchSource:0}: Error finding container 2f758020e5f4a2906f9a7aed89de78ff6e8a90dab9f0d3bb864313dfbf94da97: Status 404 returned error can't find the container with id 2f758020e5f4a2906f9a7aed89de78ff6e8a90dab9f0d3bb864313dfbf94da97 Sep 30 19:03:23 crc kubenswrapper[4747]: I0930 19:03:23.912168 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.030780 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 19:03:24 crc kubenswrapper[4747]: W0930 19:03:24.038350 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c1ea28e_2148_476c_9493_8e4be1d4dfa1.slice/crio-fa82c20a94e12bdc80b888a16921560e200e03c104e751a6f64dc212ae5c76aa WatchSource:0}: Error finding container fa82c20a94e12bdc80b888a16921560e200e03c104e751a6f64dc212ae5c76aa: Status 404 returned error can't find the container with id fa82c20a94e12bdc80b888a16921560e200e03c104e751a6f64dc212ae5c76aa Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.125505 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 19:03:24 crc kubenswrapper[4747]: W0930 19:03:24.127994 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbbbaf74_0d3f_45fb_8b60_e6edb738ebab.slice/crio-32979d8bf88677a951567a8ba72acfc8aa9a5d399660e7a980d12ebe8c0053e1 WatchSource:0}: Error finding container 32979d8bf88677a951567a8ba72acfc8aa9a5d399660e7a980d12ebe8c0053e1: Status 404 returned error can't find the container with id 32979d8bf88677a951567a8ba72acfc8aa9a5d399660e7a980d12ebe8c0053e1 Sep 30 19:03:24 crc kubenswrapper[4747]: 
I0930 19:03:24.213737 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xgjfm" event={"ID":"7211e36c-ce8d-434e-9952-e5f5eb7097ec","Type":"ContainerStarted","Data":"6ead31b7ebe72a91ab6ab0ab43c515e0b977b93f794599f2a9a62a4f0baa7d19"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.215073 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2ae06b6d-5f75-46a4-8805-0d99f3771c71","Type":"ContainerStarted","Data":"71b38397cba70511f284d11a86ad716b2d1486deaf60c371c0aa920b60a31a00"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.217285 4747 generic.go:334] "Generic (PLEG): container finished" podID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerID="53c39d999b40d314ad70f97c4d6eefe6fc4aaff86037d7ebfb57482511ac3be8" exitCode=0 Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.217335 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" event={"ID":"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61","Type":"ContainerDied","Data":"53c39d999b40d314ad70f97c4d6eefe6fc4aaff86037d7ebfb57482511ac3be8"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.219853 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2b7e2ab9-2510-47d9-b1c6-3562b1c968be","Type":"ContainerStarted","Data":"244a1a95409cfc69d9d384fd88517cf7443a8c72413444abc21aa88215d41d2c"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.222529 4747 generic.go:334] "Generic (PLEG): container finished" podID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerID="946f3a92ed75aee34d540a964276f314f5f7ce0eebdc1474aafea5c1b31be638" exitCode=0 Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.222578 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" 
event={"ID":"7f34d2ac-ba27-42df-93dd-a7803fb7220c","Type":"ContainerDied","Data":"946f3a92ed75aee34d540a964276f314f5f7ce0eebdc1474aafea5c1b31be638"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.224704 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6","Type":"ContainerStarted","Data":"2f758020e5f4a2906f9a7aed89de78ff6e8a90dab9f0d3bb864313dfbf94da97"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.227657 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6ffa91d2-f90c-4b61-be02-28351b9d7d30","Type":"ContainerStarted","Data":"4aea49b2da8228310117769294c42bbcb8f3bdae7e1c510f23dbf6e49f14488f"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.229216 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6c1ea28e-2148-476c-9493-8e4be1d4dfa1","Type":"ContainerStarted","Data":"fa82c20a94e12bdc80b888a16921560e200e03c104e751a6f64dc212ae5c76aa"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.235974 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"008f0030-d04c-427c-bc09-76874ee17b16","Type":"ContainerStarted","Data":"d0359479feb1e4cf48cba72154c9e5d5a1037dd20a1f99b6ee4a872e31d7511f"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.241253 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab","Type":"ContainerStarted","Data":"32979d8bf88677a951567a8ba72acfc8aa9a5d399660e7a980d12ebe8c0053e1"} Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.640514 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.678342 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.684129 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9nf99"] Sep 30 19:03:24 crc kubenswrapper[4747]: W0930 19:03:24.699417 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24e0750a_cba0_4fc3_8ff5_9ed716525dee.slice/crio-49b8c469f95aafe38b06fbbccc6201047134ca8264edb7a18d501b4015e0e323 WatchSource:0}: Error finding container 49b8c469f95aafe38b06fbbccc6201047134ca8264edb7a18d501b4015e0e323: Status 404 returned error can't find the container with id 49b8c469f95aafe38b06fbbccc6201047134ca8264edb7a18d501b4015e0e323 Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.707980 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ad5ca4-3275-4ab3-9818-c2afce879d04-config\") pod \"57ad5ca4-3275-4ab3-9818-c2afce879d04\" (UID: \"57ad5ca4-3275-4ab3-9818-c2afce879d04\") " Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.708376 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njpjz\" (UniqueName: \"kubernetes.io/projected/57ad5ca4-3275-4ab3-9818-c2afce879d04-kube-api-access-njpjz\") pod \"57ad5ca4-3275-4ab3-9818-c2afce879d04\" (UID: \"57ad5ca4-3275-4ab3-9818-c2afce879d04\") " Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.708694 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ad5ca4-3275-4ab3-9818-c2afce879d04-config" (OuterVolumeSpecName: "config") pod "57ad5ca4-3275-4ab3-9818-c2afce879d04" (UID: "57ad5ca4-3275-4ab3-9818-c2afce879d04"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.709195 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ad5ca4-3275-4ab3-9818-c2afce879d04-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.716338 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ad5ca4-3275-4ab3-9818-c2afce879d04-kube-api-access-njpjz" (OuterVolumeSpecName: "kube-api-access-njpjz") pod "57ad5ca4-3275-4ab3-9818-c2afce879d04" (UID: "57ad5ca4-3275-4ab3-9818-c2afce879d04"). InnerVolumeSpecName "kube-api-access-njpjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.810780 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-dns-svc\") pod \"337c81e9-fe53-4437-8a46-3d10f4d195e1\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.810889 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5ckf\" (UniqueName: \"kubernetes.io/projected/337c81e9-fe53-4437-8a46-3d10f4d195e1-kube-api-access-v5ckf\") pod \"337c81e9-fe53-4437-8a46-3d10f4d195e1\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.811017 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-config\") pod \"337c81e9-fe53-4437-8a46-3d10f4d195e1\" (UID: \"337c81e9-fe53-4437-8a46-3d10f4d195e1\") " Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.811505 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njpjz\" (UniqueName: 
\"kubernetes.io/projected/57ad5ca4-3275-4ab3-9818-c2afce879d04-kube-api-access-njpjz\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.811882 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-config" (OuterVolumeSpecName: "config") pod "337c81e9-fe53-4437-8a46-3d10f4d195e1" (UID: "337c81e9-fe53-4437-8a46-3d10f4d195e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.812435 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "337c81e9-fe53-4437-8a46-3d10f4d195e1" (UID: "337c81e9-fe53-4437-8a46-3d10f4d195e1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.814756 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337c81e9-fe53-4437-8a46-3d10f4d195e1-kube-api-access-v5ckf" (OuterVolumeSpecName: "kube-api-access-v5ckf") pod "337c81e9-fe53-4437-8a46-3d10f4d195e1" (UID: "337c81e9-fe53-4437-8a46-3d10f4d195e1"). InnerVolumeSpecName "kube-api-access-v5ckf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.913004 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.913033 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5ckf\" (UniqueName: \"kubernetes.io/projected/337c81e9-fe53-4437-8a46-3d10f4d195e1-kube-api-access-v5ckf\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:24 crc kubenswrapper[4747]: I0930 19:03:24.913051 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337c81e9-fe53-4437-8a46-3d10f4d195e1-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.249466 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9nf99" event={"ID":"24e0750a-cba0-4fc3-8ff5-9ed716525dee","Type":"ContainerStarted","Data":"49b8c469f95aafe38b06fbbccc6201047134ca8264edb7a18d501b4015e0e323"} Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.252849 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" event={"ID":"337c81e9-fe53-4437-8a46-3d10f4d195e1","Type":"ContainerDied","Data":"79da587b09841fed0f812202b56e255b30abd30b0e200e166ab4c777eb4f39d6"} Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.252913 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j7gc9" Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.255662 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" event={"ID":"7f34d2ac-ba27-42df-93dd-a7803fb7220c","Type":"ContainerStarted","Data":"79c6e453182afe6440085ff9349d1f66051d1ecd16ac1d47372344da70ff8b66"} Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.255854 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.258318 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" event={"ID":"57ad5ca4-3275-4ab3-9818-c2afce879d04","Type":"ContainerDied","Data":"021008d95c1965ab5a86221d156b4ede739850c0905efbedce98c361f308545d"} Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.258455 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6jj7d" Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.261443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" event={"ID":"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61","Type":"ContainerStarted","Data":"dc43e5a803f1901c97a328164fb74acd077319959ee1cf955e3ff98e22c191b8"} Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.261611 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.281289 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" podStartSLOduration=6.775586988 podStartE2EDuration="15.281264423s" podCreationTimestamp="2025-09-30 19:03:10 +0000 UTC" firstStartedPulling="2025-09-30 19:03:14.815181274 +0000 UTC m=+1034.474661388" lastFinishedPulling="2025-09-30 19:03:23.320858709 +0000 UTC 
m=+1042.980338823" observedRunningTime="2025-09-30 19:03:25.271454642 +0000 UTC m=+1044.930934776" watchObservedRunningTime="2025-09-30 19:03:25.281264423 +0000 UTC m=+1044.940744537" Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.301694 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" podStartSLOduration=6.793897402 podStartE2EDuration="15.301671627s" podCreationTimestamp="2025-09-30 19:03:10 +0000 UTC" firstStartedPulling="2025-09-30 19:03:14.822616527 +0000 UTC m=+1034.482096631" lastFinishedPulling="2025-09-30 19:03:23.330390752 +0000 UTC m=+1042.989870856" observedRunningTime="2025-09-30 19:03:25.291449625 +0000 UTC m=+1044.950929739" watchObservedRunningTime="2025-09-30 19:03:25.301671627 +0000 UTC m=+1044.961151741" Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.347665 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j7gc9"] Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.356652 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j7gc9"] Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.365870 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6jj7d"] Sep 30 19:03:25 crc kubenswrapper[4747]: I0930 19:03:25.369572 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6jj7d"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.103391 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="337c81e9-fe53-4437-8a46-3d10f4d195e1" path="/var/lib/kubelet/pods/337c81e9-fe53-4437-8a46-3d10f4d195e1/volumes" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.105026 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ad5ca4-3275-4ab3-9818-c2afce879d04" path="/var/lib/kubelet/pods/57ad5ca4-3275-4ab3-9818-c2afce879d04/volumes" Sep 30 19:03:27 crc 
kubenswrapper[4747]: I0930 19:03:27.456834 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ljrn6"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.457901 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.469244 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.475335 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ljrn6"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.572983 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/03e4d2ba-8585-4342-8f1b-4ef78d65911b-ovs-rundir\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.573054 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e4d2ba-8585-4342-8f1b-4ef78d65911b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.574215 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e4d2ba-8585-4342-8f1b-4ef78d65911b-combined-ca-bundle\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.574246 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjwl\" (UniqueName: \"kubernetes.io/projected/03e4d2ba-8585-4342-8f1b-4ef78d65911b-kube-api-access-pvjwl\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.574283 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e4d2ba-8585-4342-8f1b-4ef78d65911b-config\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.574585 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/03e4d2ba-8585-4342-8f1b-4ef78d65911b-ovn-rundir\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.620789 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-77gkx"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.621053 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" podUID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerName="dnsmasq-dns" containerID="cri-o://dc43e5a803f1901c97a328164fb74acd077319959ee1cf955e3ff98e22c191b8" gracePeriod=10 Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.634274 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-v8hgj"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.635608 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.649825 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-v8hgj"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.649975 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.676479 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/03e4d2ba-8585-4342-8f1b-4ef78d65911b-ovn-rundir\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.676563 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/03e4d2ba-8585-4342-8f1b-4ef78d65911b-ovs-rundir\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.676604 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e4d2ba-8585-4342-8f1b-4ef78d65911b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.676629 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e4d2ba-8585-4342-8f1b-4ef78d65911b-combined-ca-bundle\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: 
I0930 19:03:27.676657 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjwl\" (UniqueName: \"kubernetes.io/projected/03e4d2ba-8585-4342-8f1b-4ef78d65911b-kube-api-access-pvjwl\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.676688 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e4d2ba-8585-4342-8f1b-4ef78d65911b-config\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.676856 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/03e4d2ba-8585-4342-8f1b-4ef78d65911b-ovn-rundir\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.677025 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/03e4d2ba-8585-4342-8f1b-4ef78d65911b-ovs-rundir\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.677451 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03e4d2ba-8585-4342-8f1b-4ef78d65911b-config\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.685088 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e4d2ba-8585-4342-8f1b-4ef78d65911b-combined-ca-bundle\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.703557 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjwl\" (UniqueName: \"kubernetes.io/projected/03e4d2ba-8585-4342-8f1b-4ef78d65911b-kube-api-access-pvjwl\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.721534 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e4d2ba-8585-4342-8f1b-4ef78d65911b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ljrn6\" (UID: \"03e4d2ba-8585-4342-8f1b-4ef78d65911b\") " pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.778663 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7fwj\" (UniqueName: \"kubernetes.io/projected/e6b1d501-7cf4-4b68-ab10-e38d21256aac-kube-api-access-d7fwj\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.778774 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.778826 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-config\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.778970 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.800590 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ljrn6" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.811228 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nqz9b"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.811457 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" podUID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerName="dnsmasq-dns" containerID="cri-o://79c6e453182afe6440085ff9349d1f66051d1ecd16ac1d47372344da70ff8b66" gracePeriod=10 Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.836941 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-xx25s"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.838729 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.841519 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.850014 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xx25s"] Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.880850 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.880950 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7fwj\" (UniqueName: \"kubernetes.io/projected/e6b1d501-7cf4-4b68-ab10-e38d21256aac-kube-api-access-d7fwj\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.881033 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.881067 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-config\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.882177 
4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-config\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.882891 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.883492 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.914025 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7fwj\" (UniqueName: \"kubernetes.io/projected/e6b1d501-7cf4-4b68-ab10-e38d21256aac-kube-api-access-d7fwj\") pod \"dnsmasq-dns-6bc7876d45-v8hgj\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.967027 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.983174 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zkfh\" (UniqueName: \"kubernetes.io/projected/826e56d5-0794-46e2-b143-e331bc22358e-kube-api-access-8zkfh\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.983243 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-dns-svc\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.983284 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.983633 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:27 crc kubenswrapper[4747]: I0930 19:03:27.983774 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-config\") pod \"dnsmasq-dns-8554648995-xx25s\" 
(UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.085766 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zkfh\" (UniqueName: \"kubernetes.io/projected/826e56d5-0794-46e2-b143-e331bc22358e-kube-api-access-8zkfh\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.085863 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-dns-svc\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.085911 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.085975 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.086006 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-config\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " 
pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.086939 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-config\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.087537 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.087599 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-dns-svc\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.087629 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.104103 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zkfh\" (UniqueName: \"kubernetes.io/projected/826e56d5-0794-46e2-b143-e331bc22358e-kube-api-access-8zkfh\") pod \"dnsmasq-dns-8554648995-xx25s\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.236413 
4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.288033 4747 generic.go:334] "Generic (PLEG): container finished" podID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerID="dc43e5a803f1901c97a328164fb74acd077319959ee1cf955e3ff98e22c191b8" exitCode=0 Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.288112 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" event={"ID":"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61","Type":"ContainerDied","Data":"dc43e5a803f1901c97a328164fb74acd077319959ee1cf955e3ff98e22c191b8"} Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.290166 4747 generic.go:334] "Generic (PLEG): container finished" podID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerID="79c6e453182afe6440085ff9349d1f66051d1ecd16ac1d47372344da70ff8b66" exitCode=0 Sep 30 19:03:28 crc kubenswrapper[4747]: I0930 19:03:28.290217 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" event={"ID":"7f34d2ac-ba27-42df-93dd-a7803fb7220c","Type":"ContainerDied","Data":"79c6e453182afe6440085ff9349d1f66051d1ecd16ac1d47372344da70ff8b66"} Sep 30 19:03:31 crc kubenswrapper[4747]: I0930 19:03:31.037110 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" podUID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.102:5353: connect: connection refused" Sep 30 19:03:31 crc kubenswrapper[4747]: I0930 19:03:31.364010 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" podUID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.103:5353: connect: connection refused" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.619723 4747 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.630621 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.672317 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-config\") pod \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.672444 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxp4m\" (UniqueName: \"kubernetes.io/projected/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-kube-api-access-lxp4m\") pod \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.672472 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdtf8\" (UniqueName: \"kubernetes.io/projected/7f34d2ac-ba27-42df-93dd-a7803fb7220c-kube-api-access-pdtf8\") pod \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.672490 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-config\") pod \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\" (UID: \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.672624 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-dns-svc\") pod \"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\" (UID: 
\"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61\") " Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.672663 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-dns-svc\") pod \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\" (UID: \"7f34d2ac-ba27-42df-93dd-a7803fb7220c\") " Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.676904 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-kube-api-access-lxp4m" (OuterVolumeSpecName: "kube-api-access-lxp4m") pod "f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" (UID: "f275f9b3-a8e7-4513-ae6c-dd1683bd6f61"). InnerVolumeSpecName "kube-api-access-lxp4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.684232 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f34d2ac-ba27-42df-93dd-a7803fb7220c-kube-api-access-pdtf8" (OuterVolumeSpecName: "kube-api-access-pdtf8") pod "7f34d2ac-ba27-42df-93dd-a7803fb7220c" (UID: "7f34d2ac-ba27-42df-93dd-a7803fb7220c"). InnerVolumeSpecName "kube-api-access-pdtf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.777289 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxp4m\" (UniqueName: \"kubernetes.io/projected/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-kube-api-access-lxp4m\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.777489 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdtf8\" (UniqueName: \"kubernetes.io/projected/7f34d2ac-ba27-42df-93dd-a7803fb7220c-kube-api-access-pdtf8\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.845355 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" (UID: "f275f9b3-a8e7-4513-ae6c-dd1683bd6f61"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.879245 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.919283 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-config" (OuterVolumeSpecName: "config") pod "7f34d2ac-ba27-42df-93dd-a7803fb7220c" (UID: "7f34d2ac-ba27-42df-93dd-a7803fb7220c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.933003 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-config" (OuterVolumeSpecName: "config") pod "f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" (UID: "f275f9b3-a8e7-4513-ae6c-dd1683bd6f61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.938695 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f34d2ac-ba27-42df-93dd-a7803fb7220c" (UID: "7f34d2ac-ba27-42df-93dd-a7803fb7220c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.981049 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.981083 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f34d2ac-ba27-42df-93dd-a7803fb7220c-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:32 crc kubenswrapper[4747]: I0930 19:03:32.981092 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.071002 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ljrn6"] Sep 30 19:03:33 crc kubenswrapper[4747]: W0930 19:03:33.096885 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b1d501_7cf4_4b68_ab10_e38d21256aac.slice/crio-e8ea2766242ea1534757e1075b37ae6a884070e873c4d23eb94a492064437158 WatchSource:0}: Error finding container e8ea2766242ea1534757e1075b37ae6a884070e873c4d23eb94a492064437158: Status 404 returned error can't find the container with id e8ea2766242ea1534757e1075b37ae6a884070e873c4d23eb94a492064437158 Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.112858 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-v8hgj"] Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.244897 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xx25s"] Sep 30 19:03:33 crc kubenswrapper[4747]: W0930 19:03:33.254133 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod826e56d5_0794_46e2_b143_e331bc22358e.slice/crio-5a905a304b728fbe2d502121afe7d0106bf5103b9d67430ff14cf5446605c217 WatchSource:0}: Error finding container 5a905a304b728fbe2d502121afe7d0106bf5103b9d67430ff14cf5446605c217: Status 404 returned error can't find the container with id 5a905a304b728fbe2d502121afe7d0106bf5103b9d67430ff14cf5446605c217 Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.336936 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab","Type":"ContainerStarted","Data":"4eaf12081def8be99b4b6c591bfbfda9c5d13862611354c003df89eefad73c08"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.339281 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6ffa91d2-f90c-4b61-be02-28351b9d7d30","Type":"ContainerStarted","Data":"44d043f2e24a3b771d06a8414a874cccdfe672aa50a3eecb4f461db24a2fa6f0"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.341446 4747 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2ae06b6d-5f75-46a4-8805-0d99f3771c71","Type":"ContainerStarted","Data":"a69a5be0b03ee2fbce9b63b332d509f363b3231f62fe0ed5156cadc2473dd792"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.343610 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xgjfm" event={"ID":"7211e36c-ce8d-434e-9952-e5f5eb7097ec","Type":"ContainerStarted","Data":"33f525f6668efbaed1125019a66a6871d7ecb92f9def5ab3b3618c4b5df02533"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.343671 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xgjfm" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.345482 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2b7e2ab9-2510-47d9-b1c6-3562b1c968be","Type":"ContainerStarted","Data":"5cc123c53b2169b4fa0b30ecc22ff7f0d6e1bcca1711f6490ffa5d16062e5c02"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.347697 4747 generic.go:334] "Generic (PLEG): container finished" podID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" containerID="f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506" exitCode=0 Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.347757 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" event={"ID":"e6b1d501-7cf4-4b68-ab10-e38d21256aac","Type":"ContainerDied","Data":"f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.347781 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" event={"ID":"e6b1d501-7cf4-4b68-ab10-e38d21256aac","Type":"ContainerStarted","Data":"e8ea2766242ea1534757e1075b37ae6a884070e873c4d23eb94a492064437158"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.348727 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-xx25s" event={"ID":"826e56d5-0794-46e2-b143-e331bc22358e","Type":"ContainerStarted","Data":"5a905a304b728fbe2d502121afe7d0106bf5103b9d67430ff14cf5446605c217"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.351738 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.351775 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-77gkx" event={"ID":"f275f9b3-a8e7-4513-ae6c-dd1683bd6f61","Type":"ContainerDied","Data":"16bbbf7af3f6f49bf3e9b321490957b1db74df22ec2a2613db32b27ba45a3680"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.351836 4747 scope.go:117] "RemoveContainer" containerID="dc43e5a803f1901c97a328164fb74acd077319959ee1cf955e3ff98e22c191b8" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.354837 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ljrn6" event={"ID":"03e4d2ba-8585-4342-8f1b-4ef78d65911b","Type":"ContainerStarted","Data":"05d51f8535025140f130512e6f24f514c1851af166969f5b51ee06836e6d18db"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.356246 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" event={"ID":"7f34d2ac-ba27-42df-93dd-a7803fb7220c","Type":"ContainerDied","Data":"250741bc0496a40033408ded9ea95ebf7dcf77f6c77322e055413118f2fb631d"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.356343 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nqz9b" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.363558 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"008f0030-d04c-427c-bc09-76874ee17b16","Type":"ContainerStarted","Data":"eb4b076070fdd02a8029a6b18c6ba95aa0b2e9b34e79b3c11b8497047b5c2f6e"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.367884 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6c1ea28e-2148-476c-9493-8e4be1d4dfa1","Type":"ContainerStarted","Data":"9852aa0867ef27218d20bf7601e45e951181e34f34bc268ccc222dadc4b65768"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.383227 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6","Type":"ContainerStarted","Data":"487d3718b1c6fb6bd5328513061445f44088eff756e153c43c35cb9dc00959ed"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.384237 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.386666 4747 scope.go:117] "RemoveContainer" containerID="53c39d999b40d314ad70f97c4d6eefe6fc4aaff86037d7ebfb57482511ac3be8" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.397035 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9nf99" event={"ID":"24e0750a-cba0-4fc3-8ff5-9ed716525dee","Type":"ContainerStarted","Data":"ae9342b102bf62876a0f3c81c24b29f7e94dc16b6cbdd5d1fd60a6d12029f515"} Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.427262 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nqz9b"] Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.431096 4747 scope.go:117] "RemoveContainer" containerID="79c6e453182afe6440085ff9349d1f66051d1ecd16ac1d47372344da70ff8b66" Sep 30 19:03:33 crc 
kubenswrapper[4747]: I0930 19:03:33.433788 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nqz9b"] Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.439863 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xgjfm" podStartSLOduration=3.614345587 podStartE2EDuration="12.439846726s" podCreationTimestamp="2025-09-30 19:03:21 +0000 UTC" firstStartedPulling="2025-09-30 19:03:23.748506728 +0000 UTC m=+1043.407986842" lastFinishedPulling="2025-09-30 19:03:32.574007827 +0000 UTC m=+1052.233487981" observedRunningTime="2025-09-30 19:03:33.439174667 +0000 UTC m=+1053.098654791" watchObservedRunningTime="2025-09-30 19:03:33.439846726 +0000 UTC m=+1053.099326840" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.456459 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-77gkx"] Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.459116 4747 scope.go:117] "RemoveContainer" containerID="946f3a92ed75aee34d540a964276f314f5f7ce0eebdc1474aafea5c1b31be638" Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.462071 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-77gkx"] Sep 30 19:03:33 crc kubenswrapper[4747]: I0930 19:03:33.560382 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=9.949482448 podStartE2EDuration="18.560352785s" podCreationTimestamp="2025-09-30 19:03:15 +0000 UTC" firstStartedPulling="2025-09-30 19:03:23.913379306 +0000 UTC m=+1043.572859420" lastFinishedPulling="2025-09-30 19:03:32.524249643 +0000 UTC m=+1052.183729757" observedRunningTime="2025-09-30 19:03:33.555608879 +0000 UTC m=+1053.215088993" watchObservedRunningTime="2025-09-30 19:03:33.560352785 +0000 UTC m=+1053.219832899" Sep 30 19:03:34 crc kubenswrapper[4747]: I0930 19:03:34.406742 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" event={"ID":"e6b1d501-7cf4-4b68-ab10-e38d21256aac","Type":"ContainerStarted","Data":"bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08"} Sep 30 19:03:34 crc kubenswrapper[4747]: I0930 19:03:34.407072 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:34 crc kubenswrapper[4747]: I0930 19:03:34.410597 4747 generic.go:334] "Generic (PLEG): container finished" podID="826e56d5-0794-46e2-b143-e331bc22358e" containerID="10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f" exitCode=0 Sep 30 19:03:34 crc kubenswrapper[4747]: I0930 19:03:34.410669 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xx25s" event={"ID":"826e56d5-0794-46e2-b143-e331bc22358e","Type":"ContainerDied","Data":"10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f"} Sep 30 19:03:34 crc kubenswrapper[4747]: I0930 19:03:34.415311 4747 generic.go:334] "Generic (PLEG): container finished" podID="24e0750a-cba0-4fc3-8ff5-9ed716525dee" containerID="ae9342b102bf62876a0f3c81c24b29f7e94dc16b6cbdd5d1fd60a6d12029f515" exitCode=0 Sep 30 19:03:34 crc kubenswrapper[4747]: I0930 19:03:34.415479 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9nf99" event={"ID":"24e0750a-cba0-4fc3-8ff5-9ed716525dee","Type":"ContainerDied","Data":"ae9342b102bf62876a0f3c81c24b29f7e94dc16b6cbdd5d1fd60a6d12029f515"} Sep 30 19:03:34 crc kubenswrapper[4747]: I0930 19:03:34.430571 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" podStartSLOduration=7.430541257 podStartE2EDuration="7.430541257s" podCreationTimestamp="2025-09-30 19:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:03:34.425647757 +0000 UTC m=+1054.085127871" 
watchObservedRunningTime="2025-09-30 19:03:34.430541257 +0000 UTC m=+1054.090021371" Sep 30 19:03:35 crc kubenswrapper[4747]: I0930 19:03:35.102654 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" path="/var/lib/kubelet/pods/7f34d2ac-ba27-42df-93dd-a7803fb7220c/volumes" Sep 30 19:03:35 crc kubenswrapper[4747]: I0930 19:03:35.103819 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" path="/var/lib/kubelet/pods/f275f9b3-a8e7-4513-ae6c-dd1683bd6f61/volumes" Sep 30 19:03:35 crc kubenswrapper[4747]: I0930 19:03:35.427751 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9nf99" event={"ID":"24e0750a-cba0-4fc3-8ff5-9ed716525dee","Type":"ContainerStarted","Data":"592516fbaad0246970210898f6636430e1902bffa43082975afa827e74ada981"} Sep 30 19:03:35 crc kubenswrapper[4747]: I0930 19:03:35.429918 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xx25s" event={"ID":"826e56d5-0794-46e2-b143-e331bc22358e","Type":"ContainerStarted","Data":"a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235"} Sep 30 19:03:35 crc kubenswrapper[4747]: I0930 19:03:35.456842 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-xx25s" podStartSLOduration=8.456825248 podStartE2EDuration="8.456825248s" podCreationTimestamp="2025-09-30 19:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:03:35.449887769 +0000 UTC m=+1055.109367883" watchObservedRunningTime="2025-09-30 19:03:35.456825248 +0000 UTC m=+1055.116305362" Sep 30 19:03:36 crc kubenswrapper[4747]: I0930 19:03:36.438000 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9nf99" 
event={"ID":"24e0750a-cba0-4fc3-8ff5-9ed716525dee","Type":"ContainerStarted","Data":"f9511909623acbf0d0342cece7c07a13f3dfa2d4a799685ff0bfc823716e8df4"} Sep 30 19:03:36 crc kubenswrapper[4747]: I0930 19:03:36.438596 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:36 crc kubenswrapper[4747]: I0930 19:03:36.470856 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9nf99" podStartSLOduration=7.646051616 podStartE2EDuration="15.470836067s" podCreationTimestamp="2025-09-30 19:03:21 +0000 UTC" firstStartedPulling="2025-09-30 19:03:24.707859583 +0000 UTC m=+1044.367339697" lastFinishedPulling="2025-09-30 19:03:32.532644014 +0000 UTC m=+1052.192124148" observedRunningTime="2025-09-30 19:03:36.468819889 +0000 UTC m=+1056.128300013" watchObservedRunningTime="2025-09-30 19:03:36.470836067 +0000 UTC m=+1056.130316201" Sep 30 19:03:36 crc kubenswrapper[4747]: I0930 19:03:36.775238 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:36 crc kubenswrapper[4747]: I0930 19:03:36.775596 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.450485 4747 generic.go:334] "Generic (PLEG): container finished" podID="6ffa91d2-f90c-4b61-be02-28351b9d7d30" containerID="44d043f2e24a3b771d06a8414a874cccdfe672aa50a3eecb4f461db24a2fa6f0" exitCode=0 Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.450567 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6ffa91d2-f90c-4b61-be02-28351b9d7d30","Type":"ContainerDied","Data":"44d043f2e24a3b771d06a8414a874cccdfe672aa50a3eecb4f461db24a2fa6f0"} Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.453329 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="2b7e2ab9-2510-47d9-b1c6-3562b1c968be" containerID="5cc123c53b2169b4fa0b30ecc22ff7f0d6e1bcca1711f6490ffa5d16062e5c02" exitCode=0 Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.453375 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2b7e2ab9-2510-47d9-b1c6-3562b1c968be","Type":"ContainerDied","Data":"5cc123c53b2169b4fa0b30ecc22ff7f0d6e1bcca1711f6490ffa5d16062e5c02"} Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.457992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cbbbaf74-0d3f-45fb-8b60-e6edb738ebab","Type":"ContainerStarted","Data":"bf904082d1742a41e3416b23501dd9ff03c241ea8d269cce61d07df6671dc801"} Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.462403 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ljrn6" event={"ID":"03e4d2ba-8585-4342-8f1b-4ef78d65911b","Type":"ContainerStarted","Data":"a90e3abe06cd9b4c25d148cfd192dee3700448851b5a824837915d0fcb5918ef"} Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.470713 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6c1ea28e-2148-476c-9493-8e4be1d4dfa1","Type":"ContainerStarted","Data":"d73e096d21f232ab9df219384c0816e7917b26b5a0c1ee1235c7bfe000db6a1d"} Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.550937 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ljrn6" podStartSLOduration=7.215431532 podStartE2EDuration="10.550877436s" podCreationTimestamp="2025-09-30 19:03:27 +0000 UTC" firstStartedPulling="2025-09-30 19:03:33.089359436 +0000 UTC m=+1052.748839540" lastFinishedPulling="2025-09-30 19:03:36.42480533 +0000 UTC m=+1056.084285444" observedRunningTime="2025-09-30 19:03:37.536452773 +0000 UTC m=+1057.195932917" watchObservedRunningTime="2025-09-30 19:03:37.550877436 +0000 UTC m=+1057.210357570" Sep 30 
19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.591028 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.352966986 podStartE2EDuration="15.591006635s" podCreationTimestamp="2025-09-30 19:03:22 +0000 UTC" firstStartedPulling="2025-09-30 19:03:24.13222722 +0000 UTC m=+1043.791707374" lastFinishedPulling="2025-09-30 19:03:36.370266869 +0000 UTC m=+1056.029747023" observedRunningTime="2025-09-30 19:03:37.579626179 +0000 UTC m=+1057.239106313" watchObservedRunningTime="2025-09-30 19:03:37.591006635 +0000 UTC m=+1057.250486749" Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.615371 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.255131075 podStartE2EDuration="15.615352181s" podCreationTimestamp="2025-09-30 19:03:22 +0000 UTC" firstStartedPulling="2025-09-30 19:03:24.041491973 +0000 UTC m=+1043.700972087" lastFinishedPulling="2025-09-30 19:03:36.401713069 +0000 UTC m=+1056.061193193" observedRunningTime="2025-09-30 19:03:37.605422557 +0000 UTC m=+1057.264902691" watchObservedRunningTime="2025-09-30 19:03:37.615352181 +0000 UTC m=+1057.274832295" Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.655802 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:03:37 crc kubenswrapper[4747]: I0930 19:03:37.655858 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:03:38 crc 
kubenswrapper[4747]: I0930 19:03:38.406303 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.406691 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.483367 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6ffa91d2-f90c-4b61-be02-28351b9d7d30","Type":"ContainerStarted","Data":"eef9bb7a0bc185503c4ed512834748e441f3b8d04c7c1d65f2b00d29efe57686"} Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.487108 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.488274 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2b7e2ab9-2510-47d9-b1c6-3562b1c968be","Type":"ContainerStarted","Data":"e12252dbbd920c5f77a526b32758c6422969bb1ad9919504fc797a76c42fbe1b"} Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.513622 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=16.82901687 podStartE2EDuration="25.513586646s" podCreationTimestamp="2025-09-30 19:03:13 +0000 UTC" firstStartedPulling="2025-09-30 19:03:23.848037507 +0000 UTC m=+1043.507517631" lastFinishedPulling="2025-09-30 19:03:32.532607293 +0000 UTC m=+1052.192087407" observedRunningTime="2025-09-30 19:03:38.511947619 +0000 UTC m=+1058.171427743" watchObservedRunningTime="2025-09-30 19:03:38.513586646 +0000 UTC m=+1058.173066800" Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.543158 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.580201 4747 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=16.88491586 podStartE2EDuration="25.580171632s" podCreationTimestamp="2025-09-30 19:03:13 +0000 UTC" firstStartedPulling="2025-09-30 19:03:23.93448405 +0000 UTC m=+1043.593964164" lastFinishedPulling="2025-09-30 19:03:32.629739822 +0000 UTC m=+1052.289219936" observedRunningTime="2025-09-30 19:03:38.568557039 +0000 UTC m=+1058.228037233" watchObservedRunningTime="2025-09-30 19:03:38.580171632 +0000 UTC m=+1058.239651776" Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.702222 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.704137 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:38 crc kubenswrapper[4747]: I0930 19:03:38.764916 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.572499 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.758954 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 19:03:39 crc kubenswrapper[4747]: E0930 19:03:39.759325 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerName="init" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.759346 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerName="init" Sep 30 19:03:39 crc kubenswrapper[4747]: E0930 19:03:39.759365 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerName="init" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.759374 4747 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerName="init" Sep 30 19:03:39 crc kubenswrapper[4747]: E0930 19:03:39.759391 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerName="dnsmasq-dns" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.759401 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerName="dnsmasq-dns" Sep 30 19:03:39 crc kubenswrapper[4747]: E0930 19:03:39.759416 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerName="dnsmasq-dns" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.759447 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerName="dnsmasq-dns" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.759675 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f34d2ac-ba27-42df-93dd-a7803fb7220c" containerName="dnsmasq-dns" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.759697 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f275f9b3-a8e7-4513-ae6c-dd1683bd6f61" containerName="dnsmasq-dns" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.760689 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.764389 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.765245 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.765427 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xcfmv" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.767859 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.784609 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.816426 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-config\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.816508 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-scripts\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.816556 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc 
kubenswrapper[4747]: I0930 19:03:39.816590 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l498k\" (UniqueName: \"kubernetes.io/projected/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-kube-api-access-l498k\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.816621 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.816646 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.816754 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.917708 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.917764 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.917787 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.917814 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-config\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.917859 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-scripts\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.917903 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.917948 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l498k\" (UniqueName: \"kubernetes.io/projected/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-kube-api-access-l498k\") pod \"ovn-northd-0\" (UID: 
\"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.918776 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.919131 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-config\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.920090 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-scripts\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.923815 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.923877 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.923950 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:39 crc kubenswrapper[4747]: I0930 19:03:39.944144 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l498k\" (UniqueName: \"kubernetes.io/projected/ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58-kube-api-access-l498k\") pod \"ovn-northd-0\" (UID: \"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58\") " pod="openstack/ovn-northd-0" Sep 30 19:03:40 crc kubenswrapper[4747]: I0930 19:03:40.088488 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 19:03:40 crc kubenswrapper[4747]: I0930 19:03:40.615165 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 19:03:40 crc kubenswrapper[4747]: I0930 19:03:40.638558 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 19:03:41 crc kubenswrapper[4747]: I0930 19:03:41.528742 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58","Type":"ContainerStarted","Data":"a12a1c791a8349c0ee8a2ac61b4d644598593ac40f25df39963bf7930f092fd4"} Sep 30 19:03:42 crc kubenswrapper[4747]: I0930 19:03:42.537067 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58","Type":"ContainerStarted","Data":"6fd0447e00e28bb12c864dd6b37e57af1e9b904c44dadfd2aa998b7f665745fa"} Sep 30 19:03:42 crc kubenswrapper[4747]: I0930 19:03:42.537834 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58","Type":"ContainerStarted","Data":"7562c98daae99c94442e36143796934bea52203734430508673418d3b8542a7d"} Sep 30 19:03:42 crc kubenswrapper[4747]: I0930 19:03:42.538072 4747 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 19:03:42 crc kubenswrapper[4747]: I0930 19:03:42.578744 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.3237552089999998 podStartE2EDuration="3.578716913s" podCreationTimestamp="2025-09-30 19:03:39 +0000 UTC" firstStartedPulling="2025-09-30 19:03:40.640996269 +0000 UTC m=+1060.300476423" lastFinishedPulling="2025-09-30 19:03:41.895958013 +0000 UTC m=+1061.555438127" observedRunningTime="2025-09-30 19:03:42.556342652 +0000 UTC m=+1062.215822796" watchObservedRunningTime="2025-09-30 19:03:42.578716913 +0000 UTC m=+1062.238197047" Sep 30 19:03:42 crc kubenswrapper[4747]: I0930 19:03:42.970299 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:43 crc kubenswrapper[4747]: I0930 19:03:43.238114 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:03:43 crc kubenswrapper[4747]: I0930 19:03:43.292090 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-v8hgj"] Sep 30 19:03:43 crc kubenswrapper[4747]: I0930 19:03:43.544036 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" podUID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" containerName="dnsmasq-dns" containerID="cri-o://bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08" gracePeriod=10 Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.070081 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.210977 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-dns-svc\") pod \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.211113 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7fwj\" (UniqueName: \"kubernetes.io/projected/e6b1d501-7cf4-4b68-ab10-e38d21256aac-kube-api-access-d7fwj\") pod \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.212164 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-config\") pod \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.212212 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-ovsdbserver-sb\") pod \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\" (UID: \"e6b1d501-7cf4-4b68-ab10-e38d21256aac\") " Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.218268 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b1d501-7cf4-4b68-ab10-e38d21256aac-kube-api-access-d7fwj" (OuterVolumeSpecName: "kube-api-access-d7fwj") pod "e6b1d501-7cf4-4b68-ab10-e38d21256aac" (UID: "e6b1d501-7cf4-4b68-ab10-e38d21256aac"). InnerVolumeSpecName "kube-api-access-d7fwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.262790 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6b1d501-7cf4-4b68-ab10-e38d21256aac" (UID: "e6b1d501-7cf4-4b68-ab10-e38d21256aac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.264941 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-config" (OuterVolumeSpecName: "config") pod "e6b1d501-7cf4-4b68-ab10-e38d21256aac" (UID: "e6b1d501-7cf4-4b68-ab10-e38d21256aac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.267513 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6b1d501-7cf4-4b68-ab10-e38d21256aac" (UID: "e6b1d501-7cf4-4b68-ab10-e38d21256aac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.314744 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7fwj\" (UniqueName: \"kubernetes.io/projected/e6b1d501-7cf4-4b68-ab10-e38d21256aac-kube-api-access-d7fwj\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.314801 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.314815 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.314825 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b1d501-7cf4-4b68-ab10-e38d21256aac-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.556488 4747 generic.go:334] "Generic (PLEG): container finished" podID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" containerID="bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08" exitCode=0 Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.559763 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.562482 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" event={"ID":"e6b1d501-7cf4-4b68-ab10-e38d21256aac","Type":"ContainerDied","Data":"bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08"} Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.562568 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-v8hgj" event={"ID":"e6b1d501-7cf4-4b68-ab10-e38d21256aac","Type":"ContainerDied","Data":"e8ea2766242ea1534757e1075b37ae6a884070e873c4d23eb94a492064437158"} Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.562595 4747 scope.go:117] "RemoveContainer" containerID="bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.597676 4747 scope.go:117] "RemoveContainer" containerID="f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.610570 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-v8hgj"] Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.617096 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-v8hgj"] Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.709721 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 19:03:44 crc kubenswrapper[4747]: I0930 19:03:44.710898 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 19:03:45 crc kubenswrapper[4747]: I0930 19:03:45.097544 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" path="/var/lib/kubelet/pods/e6b1d501-7cf4-4b68-ab10-e38d21256aac/volumes" Sep 30 19:03:45 crc 
kubenswrapper[4747]: I0930 19:03:45.138622 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:45 crc kubenswrapper[4747]: I0930 19:03:45.138678 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:46 crc kubenswrapper[4747]: I0930 19:03:46.004226 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 19:03:46 crc kubenswrapper[4747]: I0930 19:03:46.206168 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:46 crc kubenswrapper[4747]: I0930 19:03:46.281278 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="2b7e2ab9-2510-47d9-b1c6-3562b1c968be" containerName="galera" probeResult="failure" output=< Sep 30 19:03:46 crc kubenswrapper[4747]: wsrep_local_state_comment (Joined) differs from Synced Sep 30 19:03:46 crc kubenswrapper[4747]: > Sep 30 19:03:46 crc kubenswrapper[4747]: I0930 19:03:46.648687 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 19:03:48 crc kubenswrapper[4747]: I0930 19:03:48.140413 4747 scope.go:117] "RemoveContainer" containerID="bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08" Sep 30 19:03:48 crc kubenswrapper[4747]: E0930 19:03:48.141065 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08\": container with ID starting with bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08 not found: ID does not exist" containerID="bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08" Sep 30 19:03:48 crc kubenswrapper[4747]: I0930 19:03:48.141140 4747 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08"} err="failed to get container status \"bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08\": rpc error: code = NotFound desc = could not find container \"bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08\": container with ID starting with bb9bc2fc87e3eb855e3fd1d9b099c82cecfcad4123027966b9a7b9ae45093e08 not found: ID does not exist" Sep 30 19:03:48 crc kubenswrapper[4747]: I0930 19:03:48.141211 4747 scope.go:117] "RemoveContainer" containerID="f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506" Sep 30 19:03:48 crc kubenswrapper[4747]: E0930 19:03:48.141890 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506\": container with ID starting with f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506 not found: ID does not exist" containerID="f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506" Sep 30 19:03:48 crc kubenswrapper[4747]: I0930 19:03:48.141968 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506"} err="failed to get container status \"f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506\": rpc error: code = NotFound desc = could not find container \"f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506\": container with ID starting with f9b53827a0b34545514ff9c659fe73d07dc26d12efd835b72d1cbd2aa73dc506 not found: ID does not exist" Sep 30 19:03:50 crc kubenswrapper[4747]: I0930 19:03:50.912318 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-s8xfv"] Sep 30 19:03:50 crc kubenswrapper[4747]: E0930 19:03:50.912818 4747 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" containerName="init" Sep 30 19:03:50 crc kubenswrapper[4747]: I0930 19:03:50.912843 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" containerName="init" Sep 30 19:03:50 crc kubenswrapper[4747]: E0930 19:03:50.912870 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" containerName="dnsmasq-dns" Sep 30 19:03:50 crc kubenswrapper[4747]: I0930 19:03:50.912881 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" containerName="dnsmasq-dns" Sep 30 19:03:50 crc kubenswrapper[4747]: I0930 19:03:50.913171 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b1d501-7cf4-4b68-ab10-e38d21256aac" containerName="dnsmasq-dns" Sep 30 19:03:50 crc kubenswrapper[4747]: I0930 19:03:50.914235 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s8xfv" Sep 30 19:03:50 crc kubenswrapper[4747]: I0930 19:03:50.924065 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s8xfv"] Sep 30 19:03:50 crc kubenswrapper[4747]: I0930 19:03:50.956816 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs54q\" (UniqueName: \"kubernetes.io/projected/41949c60-b7d7-4810-8c5e-67a739ec4f17-kube-api-access-vs54q\") pod \"glance-db-create-s8xfv\" (UID: \"41949c60-b7d7-4810-8c5e-67a739ec4f17\") " pod="openstack/glance-db-create-s8xfv" Sep 30 19:03:51 crc kubenswrapper[4747]: I0930 19:03:51.058850 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs54q\" (UniqueName: \"kubernetes.io/projected/41949c60-b7d7-4810-8c5e-67a739ec4f17-kube-api-access-vs54q\") pod \"glance-db-create-s8xfv\" (UID: \"41949c60-b7d7-4810-8c5e-67a739ec4f17\") " pod="openstack/glance-db-create-s8xfv" Sep 30 
19:03:51 crc kubenswrapper[4747]: I0930 19:03:51.080166 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs54q\" (UniqueName: \"kubernetes.io/projected/41949c60-b7d7-4810-8c5e-67a739ec4f17-kube-api-access-vs54q\") pod \"glance-db-create-s8xfv\" (UID: \"41949c60-b7d7-4810-8c5e-67a739ec4f17\") " pod="openstack/glance-db-create-s8xfv" Sep 30 19:03:51 crc kubenswrapper[4747]: I0930 19:03:51.256608 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s8xfv" Sep 30 19:03:51 crc kubenswrapper[4747]: I0930 19:03:51.703884 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s8xfv"] Sep 30 19:03:51 crc kubenswrapper[4747]: W0930 19:03:51.709917 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41949c60_b7d7_4810_8c5e_67a739ec4f17.slice/crio-b797cfdffb5f5fc945849502f44409e8514eceaa1efb53ca2aa149611c03b0e3 WatchSource:0}: Error finding container b797cfdffb5f5fc945849502f44409e8514eceaa1efb53ca2aa149611c03b0e3: Status 404 returned error can't find the container with id b797cfdffb5f5fc945849502f44409e8514eceaa1efb53ca2aa149611c03b0e3 Sep 30 19:03:52 crc kubenswrapper[4747]: I0930 19:03:52.654173 4747 generic.go:334] "Generic (PLEG): container finished" podID="41949c60-b7d7-4810-8c5e-67a739ec4f17" containerID="987dccbdaae285fe943b3af90192e9ef8de35690eadcbdfce774c27d5b9c25b7" exitCode=0 Sep 30 19:03:52 crc kubenswrapper[4747]: I0930 19:03:52.654282 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s8xfv" event={"ID":"41949c60-b7d7-4810-8c5e-67a739ec4f17","Type":"ContainerDied","Data":"987dccbdaae285fe943b3af90192e9ef8de35690eadcbdfce774c27d5b9c25b7"} Sep 30 19:03:52 crc kubenswrapper[4747]: I0930 19:03:52.654524 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s8xfv" 
event={"ID":"41949c60-b7d7-4810-8c5e-67a739ec4f17","Type":"ContainerStarted","Data":"b797cfdffb5f5fc945849502f44409e8514eceaa1efb53ca2aa149611c03b0e3"} Sep 30 19:03:54 crc kubenswrapper[4747]: I0930 19:03:54.088057 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s8xfv" Sep 30 19:03:54 crc kubenswrapper[4747]: I0930 19:03:54.227094 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs54q\" (UniqueName: \"kubernetes.io/projected/41949c60-b7d7-4810-8c5e-67a739ec4f17-kube-api-access-vs54q\") pod \"41949c60-b7d7-4810-8c5e-67a739ec4f17\" (UID: \"41949c60-b7d7-4810-8c5e-67a739ec4f17\") " Sep 30 19:03:54 crc kubenswrapper[4747]: I0930 19:03:54.234467 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41949c60-b7d7-4810-8c5e-67a739ec4f17-kube-api-access-vs54q" (OuterVolumeSpecName: "kube-api-access-vs54q") pod "41949c60-b7d7-4810-8c5e-67a739ec4f17" (UID: "41949c60-b7d7-4810-8c5e-67a739ec4f17"). InnerVolumeSpecName "kube-api-access-vs54q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:54 crc kubenswrapper[4747]: I0930 19:03:54.329519 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs54q\" (UniqueName: \"kubernetes.io/projected/41949c60-b7d7-4810-8c5e-67a739ec4f17-kube-api-access-vs54q\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:54 crc kubenswrapper[4747]: I0930 19:03:54.678730 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s8xfv" event={"ID":"41949c60-b7d7-4810-8c5e-67a739ec4f17","Type":"ContainerDied","Data":"b797cfdffb5f5fc945849502f44409e8514eceaa1efb53ca2aa149611c03b0e3"} Sep 30 19:03:54 crc kubenswrapper[4747]: I0930 19:03:54.678818 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-s8xfv" Sep 30 19:03:54 crc kubenswrapper[4747]: I0930 19:03:54.678849 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b797cfdffb5f5fc945849502f44409e8514eceaa1efb53ca2aa149611c03b0e3" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.186856 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9cn8j"] Sep 30 19:03:55 crc kubenswrapper[4747]: E0930 19:03:55.187339 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41949c60-b7d7-4810-8c5e-67a739ec4f17" containerName="mariadb-database-create" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.187358 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="41949c60-b7d7-4810-8c5e-67a739ec4f17" containerName="mariadb-database-create" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.187574 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="41949c60-b7d7-4810-8c5e-67a739ec4f17" containerName="mariadb-database-create" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.188422 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9cn8j" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.196185 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9cn8j"] Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.213325 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.239476 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.348900 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7w7m\" (UniqueName: \"kubernetes.io/projected/ab203f58-7cb7-497d-9a9b-201f39531e63-kube-api-access-q7w7m\") pod \"keystone-db-create-9cn8j\" (UID: \"ab203f58-7cb7-497d-9a9b-201f39531e63\") " pod="openstack/keystone-db-create-9cn8j" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.453087 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7w7m\" (UniqueName: \"kubernetes.io/projected/ab203f58-7cb7-497d-9a9b-201f39531e63-kube-api-access-q7w7m\") pod \"keystone-db-create-9cn8j\" (UID: \"ab203f58-7cb7-497d-9a9b-201f39531e63\") " pod="openstack/keystone-db-create-9cn8j" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.480175 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7w7m\" (UniqueName: \"kubernetes.io/projected/ab203f58-7cb7-497d-9a9b-201f39531e63-kube-api-access-q7w7m\") pod \"keystone-db-create-9cn8j\" (UID: \"ab203f58-7cb7-497d-9a9b-201f39531e63\") " pod="openstack/keystone-db-create-9cn8j" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.491130 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ndvkc"] Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.492765 4747 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ndvkc" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.495184 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ndvkc"] Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.525006 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9cn8j" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.657454 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjmg\" (UniqueName: \"kubernetes.io/projected/e3eb9234-0905-4ab0-ae40-5829d01890aa-kube-api-access-fpjmg\") pod \"placement-db-create-ndvkc\" (UID: \"e3eb9234-0905-4ab0-ae40-5829d01890aa\") " pod="openstack/placement-db-create-ndvkc" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.758992 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjmg\" (UniqueName: \"kubernetes.io/projected/e3eb9234-0905-4ab0-ae40-5829d01890aa-kube-api-access-fpjmg\") pod \"placement-db-create-ndvkc\" (UID: \"e3eb9234-0905-4ab0-ae40-5829d01890aa\") " pod="openstack/placement-db-create-ndvkc" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.779982 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjmg\" (UniqueName: \"kubernetes.io/projected/e3eb9234-0905-4ab0-ae40-5829d01890aa-kube-api-access-fpjmg\") pod \"placement-db-create-ndvkc\" (UID: \"e3eb9234-0905-4ab0-ae40-5829d01890aa\") " pod="openstack/placement-db-create-ndvkc" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.826419 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ndvkc" Sep 30 19:03:55 crc kubenswrapper[4747]: I0930 19:03:55.990285 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9cn8j"] Sep 30 19:03:56 crc kubenswrapper[4747]: W0930 19:03:56.001643 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab203f58_7cb7_497d_9a9b_201f39531e63.slice/crio-44cd90b6e027474533d3468ff596002a57bc4ae68221e362549546f29420bf4e WatchSource:0}: Error finding container 44cd90b6e027474533d3468ff596002a57bc4ae68221e362549546f29420bf4e: Status 404 returned error can't find the container with id 44cd90b6e027474533d3468ff596002a57bc4ae68221e362549546f29420bf4e Sep 30 19:03:56 crc kubenswrapper[4747]: W0930 19:03:56.307890 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3eb9234_0905_4ab0_ae40_5829d01890aa.slice/crio-61a0e410daa6b30235b0c7198a82d922e39dec41d33d168e7b5749acd336b0f5 WatchSource:0}: Error finding container 61a0e410daa6b30235b0c7198a82d922e39dec41d33d168e7b5749acd336b0f5: Status 404 returned error can't find the container with id 61a0e410daa6b30235b0c7198a82d922e39dec41d33d168e7b5749acd336b0f5 Sep 30 19:03:56 crc kubenswrapper[4747]: I0930 19:03:56.308101 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ndvkc"] Sep 30 19:03:56 crc kubenswrapper[4747]: I0930 19:03:56.716264 4747 generic.go:334] "Generic (PLEG): container finished" podID="e3eb9234-0905-4ab0-ae40-5829d01890aa" containerID="5c3efdef7c3c30deb2ac09b34ed58a1a6b899b49e34d3078d56ec0199261dae5" exitCode=0 Sep 30 19:03:56 crc kubenswrapper[4747]: I0930 19:03:56.716368 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ndvkc" 
event={"ID":"e3eb9234-0905-4ab0-ae40-5829d01890aa","Type":"ContainerDied","Data":"5c3efdef7c3c30deb2ac09b34ed58a1a6b899b49e34d3078d56ec0199261dae5"} Sep 30 19:03:56 crc kubenswrapper[4747]: I0930 19:03:56.716411 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ndvkc" event={"ID":"e3eb9234-0905-4ab0-ae40-5829d01890aa","Type":"ContainerStarted","Data":"61a0e410daa6b30235b0c7198a82d922e39dec41d33d168e7b5749acd336b0f5"} Sep 30 19:03:56 crc kubenswrapper[4747]: I0930 19:03:56.718843 4747 generic.go:334] "Generic (PLEG): container finished" podID="ab203f58-7cb7-497d-9a9b-201f39531e63" containerID="0a42e01b990fa64d8599727fb18de6d3705d4a3499aedd36c074216640c98176" exitCode=0 Sep 30 19:03:56 crc kubenswrapper[4747]: I0930 19:03:56.718914 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9cn8j" event={"ID":"ab203f58-7cb7-497d-9a9b-201f39531e63","Type":"ContainerDied","Data":"0a42e01b990fa64d8599727fb18de6d3705d4a3499aedd36c074216640c98176"} Sep 30 19:03:56 crc kubenswrapper[4747]: I0930 19:03:56.719030 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9cn8j" event={"ID":"ab203f58-7cb7-497d-9a9b-201f39531e63","Type":"ContainerStarted","Data":"44cd90b6e027474533d3468ff596002a57bc4ae68221e362549546f29420bf4e"} Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.199396 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ndvkc" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.206321 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9cn8j" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.308203 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpjmg\" (UniqueName: \"kubernetes.io/projected/e3eb9234-0905-4ab0-ae40-5829d01890aa-kube-api-access-fpjmg\") pod \"e3eb9234-0905-4ab0-ae40-5829d01890aa\" (UID: \"e3eb9234-0905-4ab0-ae40-5829d01890aa\") " Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.309434 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7w7m\" (UniqueName: \"kubernetes.io/projected/ab203f58-7cb7-497d-9a9b-201f39531e63-kube-api-access-q7w7m\") pod \"ab203f58-7cb7-497d-9a9b-201f39531e63\" (UID: \"ab203f58-7cb7-497d-9a9b-201f39531e63\") " Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.317103 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab203f58-7cb7-497d-9a9b-201f39531e63-kube-api-access-q7w7m" (OuterVolumeSpecName: "kube-api-access-q7w7m") pod "ab203f58-7cb7-497d-9a9b-201f39531e63" (UID: "ab203f58-7cb7-497d-9a9b-201f39531e63"). InnerVolumeSpecName "kube-api-access-q7w7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.320149 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3eb9234-0905-4ab0-ae40-5829d01890aa-kube-api-access-fpjmg" (OuterVolumeSpecName: "kube-api-access-fpjmg") pod "e3eb9234-0905-4ab0-ae40-5829d01890aa" (UID: "e3eb9234-0905-4ab0-ae40-5829d01890aa"). InnerVolumeSpecName "kube-api-access-fpjmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.411760 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpjmg\" (UniqueName: \"kubernetes.io/projected/e3eb9234-0905-4ab0-ae40-5829d01890aa-kube-api-access-fpjmg\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.411795 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7w7m\" (UniqueName: \"kubernetes.io/projected/ab203f58-7cb7-497d-9a9b-201f39531e63-kube-api-access-q7w7m\") on node \"crc\" DevicePath \"\"" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.743152 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9cn8j" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.743152 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9cn8j" event={"ID":"ab203f58-7cb7-497d-9a9b-201f39531e63","Type":"ContainerDied","Data":"44cd90b6e027474533d3468ff596002a57bc4ae68221e362549546f29420bf4e"} Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.743353 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44cd90b6e027474533d3468ff596002a57bc4ae68221e362549546f29420bf4e" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.745552 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ndvkc" event={"ID":"e3eb9234-0905-4ab0-ae40-5829d01890aa","Type":"ContainerDied","Data":"61a0e410daa6b30235b0c7198a82d922e39dec41d33d168e7b5749acd336b0f5"} Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.745599 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ndvkc" Sep 30 19:03:58 crc kubenswrapper[4747]: I0930 19:03:58.745608 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61a0e410daa6b30235b0c7198a82d922e39dec41d33d168e7b5749acd336b0f5" Sep 30 19:04:00 crc kubenswrapper[4747]: I0930 19:04:00.975205 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-96e4-account-create-qbklj"] Sep 30 19:04:00 crc kubenswrapper[4747]: E0930 19:04:00.977861 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab203f58-7cb7-497d-9a9b-201f39531e63" containerName="mariadb-database-create" Sep 30 19:04:00 crc kubenswrapper[4747]: I0930 19:04:00.978136 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab203f58-7cb7-497d-9a9b-201f39531e63" containerName="mariadb-database-create" Sep 30 19:04:00 crc kubenswrapper[4747]: E0930 19:04:00.978340 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3eb9234-0905-4ab0-ae40-5829d01890aa" containerName="mariadb-database-create" Sep 30 19:04:00 crc kubenswrapper[4747]: I0930 19:04:00.978516 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3eb9234-0905-4ab0-ae40-5829d01890aa" containerName="mariadb-database-create" Sep 30 19:04:00 crc kubenswrapper[4747]: I0930 19:04:00.979158 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3eb9234-0905-4ab0-ae40-5829d01890aa" containerName="mariadb-database-create" Sep 30 19:04:00 crc kubenswrapper[4747]: I0930 19:04:00.979361 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab203f58-7cb7-497d-9a9b-201f39531e63" containerName="mariadb-database-create" Sep 30 19:04:00 crc kubenswrapper[4747]: I0930 19:04:00.980869 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-96e4-account-create-qbklj" Sep 30 19:04:00 crc kubenswrapper[4747]: I0930 19:04:00.985024 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 19:04:00 crc kubenswrapper[4747]: I0930 19:04:00.990087 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-96e4-account-create-qbklj"] Sep 30 19:04:01 crc kubenswrapper[4747]: I0930 19:04:01.067494 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbd9l\" (UniqueName: \"kubernetes.io/projected/7c7f8831-1689-4327-a4ce-ff770e5840ec-kube-api-access-hbd9l\") pod \"glance-96e4-account-create-qbklj\" (UID: \"7c7f8831-1689-4327-a4ce-ff770e5840ec\") " pod="openstack/glance-96e4-account-create-qbklj" Sep 30 19:04:01 crc kubenswrapper[4747]: I0930 19:04:01.170024 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbd9l\" (UniqueName: \"kubernetes.io/projected/7c7f8831-1689-4327-a4ce-ff770e5840ec-kube-api-access-hbd9l\") pod \"glance-96e4-account-create-qbklj\" (UID: \"7c7f8831-1689-4327-a4ce-ff770e5840ec\") " pod="openstack/glance-96e4-account-create-qbklj" Sep 30 19:04:01 crc kubenswrapper[4747]: I0930 19:04:01.193634 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbd9l\" (UniqueName: \"kubernetes.io/projected/7c7f8831-1689-4327-a4ce-ff770e5840ec-kube-api-access-hbd9l\") pod \"glance-96e4-account-create-qbklj\" (UID: \"7c7f8831-1689-4327-a4ce-ff770e5840ec\") " pod="openstack/glance-96e4-account-create-qbklj" Sep 30 19:04:01 crc kubenswrapper[4747]: I0930 19:04:01.317999 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-96e4-account-create-qbklj" Sep 30 19:04:01 crc kubenswrapper[4747]: I0930 19:04:01.645052 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-96e4-account-create-qbklj"] Sep 30 19:04:01 crc kubenswrapper[4747]: W0930 19:04:01.653245 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c7f8831_1689_4327_a4ce_ff770e5840ec.slice/crio-034364f14aa97e05e00de594e469126c9148b7eb1c99ca67a0b14cba767c81b1 WatchSource:0}: Error finding container 034364f14aa97e05e00de594e469126c9148b7eb1c99ca67a0b14cba767c81b1: Status 404 returned error can't find the container with id 034364f14aa97e05e00de594e469126c9148b7eb1c99ca67a0b14cba767c81b1 Sep 30 19:04:01 crc kubenswrapper[4747]: I0930 19:04:01.771671 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-96e4-account-create-qbklj" event={"ID":"7c7f8831-1689-4327-a4ce-ff770e5840ec","Type":"ContainerStarted","Data":"034364f14aa97e05e00de594e469126c9148b7eb1c99ca67a0b14cba767c81b1"} Sep 30 19:04:02 crc kubenswrapper[4747]: I0930 19:04:02.783347 4747 generic.go:334] "Generic (PLEG): container finished" podID="7c7f8831-1689-4327-a4ce-ff770e5840ec" containerID="29c6d379cc86e0b0977c216ea3552b824ae6bfbd0fa4d87b89a756788d7a4fa6" exitCode=0 Sep 30 19:04:02 crc kubenswrapper[4747]: I0930 19:04:02.783633 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-96e4-account-create-qbklj" event={"ID":"7c7f8831-1689-4327-a4ce-ff770e5840ec","Type":"ContainerDied","Data":"29c6d379cc86e0b0977c216ea3552b824ae6bfbd0fa4d87b89a756788d7a4fa6"} Sep 30 19:04:03 crc kubenswrapper[4747]: I0930 19:04:03.799113 4747 generic.go:334] "Generic (PLEG): container finished" podID="008f0030-d04c-427c-bc09-76874ee17b16" containerID="eb4b076070fdd02a8029a6b18c6ba95aa0b2e9b34e79b3c11b8497047b5c2f6e" exitCode=0 Sep 30 19:04:03 crc kubenswrapper[4747]: I0930 19:04:03.799186 4747 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"008f0030-d04c-427c-bc09-76874ee17b16","Type":"ContainerDied","Data":"eb4b076070fdd02a8029a6b18c6ba95aa0b2e9b34e79b3c11b8497047b5c2f6e"} Sep 30 19:04:03 crc kubenswrapper[4747]: I0930 19:04:03.803139 4747 generic.go:334] "Generic (PLEG): container finished" podID="2ae06b6d-5f75-46a4-8805-0d99f3771c71" containerID="a69a5be0b03ee2fbce9b63b332d509f363b3231f62fe0ed5156cadc2473dd792" exitCode=0 Sep 30 19:04:03 crc kubenswrapper[4747]: I0930 19:04:03.803757 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2ae06b6d-5f75-46a4-8805-0d99f3771c71","Type":"ContainerDied","Data":"a69a5be0b03ee2fbce9b63b332d509f363b3231f62fe0ed5156cadc2473dd792"} Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.216145 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-96e4-account-create-qbklj" Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.334402 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbd9l\" (UniqueName: \"kubernetes.io/projected/7c7f8831-1689-4327-a4ce-ff770e5840ec-kube-api-access-hbd9l\") pod \"7c7f8831-1689-4327-a4ce-ff770e5840ec\" (UID: \"7c7f8831-1689-4327-a4ce-ff770e5840ec\") " Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.341200 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7f8831-1689-4327-a4ce-ff770e5840ec-kube-api-access-hbd9l" (OuterVolumeSpecName: "kube-api-access-hbd9l") pod "7c7f8831-1689-4327-a4ce-ff770e5840ec" (UID: "7c7f8831-1689-4327-a4ce-ff770e5840ec"). InnerVolumeSpecName "kube-api-access-hbd9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.437114 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbd9l\" (UniqueName: \"kubernetes.io/projected/7c7f8831-1689-4327-a4ce-ff770e5840ec-kube-api-access-hbd9l\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.818604 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2ae06b6d-5f75-46a4-8805-0d99f3771c71","Type":"ContainerStarted","Data":"4b92196a40822aa1147f873ffbc547a298470ab6499ff2db1c8a06f0b3e365a4"} Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.819064 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.821865 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-96e4-account-create-qbklj" event={"ID":"7c7f8831-1689-4327-a4ce-ff770e5840ec","Type":"ContainerDied","Data":"034364f14aa97e05e00de594e469126c9148b7eb1c99ca67a0b14cba767c81b1"} Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.821913 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="034364f14aa97e05e00de594e469126c9148b7eb1c99ca67a0b14cba767c81b1" Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.822360 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-96e4-account-create-qbklj" Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.825686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"008f0030-d04c-427c-bc09-76874ee17b16","Type":"ContainerStarted","Data":"cf8eeff8f547394a28d2a3cbcffb46920403487e2b1021dab8de517c48008354"} Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.826173 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.855507 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.040453863 podStartE2EDuration="53.855484732s" podCreationTimestamp="2025-09-30 19:03:11 +0000 UTC" firstStartedPulling="2025-09-30 19:03:23.725988334 +0000 UTC m=+1043.385468448" lastFinishedPulling="2025-09-30 19:03:32.541019193 +0000 UTC m=+1052.200499317" observedRunningTime="2025-09-30 19:04:04.852318101 +0000 UTC m=+1084.511798245" watchObservedRunningTime="2025-09-30 19:04:04.855484732 +0000 UTC m=+1084.514964876" Sep 30 19:04:04 crc kubenswrapper[4747]: I0930 19:04:04.892528 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.208974305 podStartE2EDuration="54.89245821s" podCreationTimestamp="2025-09-30 19:03:10 +0000 UTC" firstStartedPulling="2025-09-30 19:03:23.783047027 +0000 UTC m=+1043.442527141" lastFinishedPulling="2025-09-30 19:03:32.466530922 +0000 UTC m=+1052.126011046" observedRunningTime="2025-09-30 19:04:04.888457435 +0000 UTC m=+1084.547937549" watchObservedRunningTime="2025-09-30 19:04:04.89245821 +0000 UTC m=+1084.551938364" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.316854 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a085-account-create-t62dz"] Sep 30 19:04:05 crc kubenswrapper[4747]: E0930 
19:04:05.317174 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7f8831-1689-4327-a4ce-ff770e5840ec" containerName="mariadb-account-create" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.317188 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7f8831-1689-4327-a4ce-ff770e5840ec" containerName="mariadb-account-create" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.317411 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7f8831-1689-4327-a4ce-ff770e5840ec" containerName="mariadb-account-create" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.318010 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a085-account-create-t62dz" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.321245 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.328822 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a085-account-create-t62dz"] Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.456362 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mldp2\" (UniqueName: \"kubernetes.io/projected/1d0e13c0-006b-4a57-853f-2e0cc942319a-kube-api-access-mldp2\") pod \"keystone-a085-account-create-t62dz\" (UID: \"1d0e13c0-006b-4a57-853f-2e0cc942319a\") " pod="openstack/keystone-a085-account-create-t62dz" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.558705 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mldp2\" (UniqueName: \"kubernetes.io/projected/1d0e13c0-006b-4a57-853f-2e0cc942319a-kube-api-access-mldp2\") pod \"keystone-a085-account-create-t62dz\" (UID: \"1d0e13c0-006b-4a57-853f-2e0cc942319a\") " pod="openstack/keystone-a085-account-create-t62dz" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 
19:04:05.601666 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mldp2\" (UniqueName: \"kubernetes.io/projected/1d0e13c0-006b-4a57-853f-2e0cc942319a-kube-api-access-mldp2\") pod \"keystone-a085-account-create-t62dz\" (UID: \"1d0e13c0-006b-4a57-853f-2e0cc942319a\") " pod="openstack/keystone-a085-account-create-t62dz" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.612638 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2a30-account-create-jbfsq"] Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.614129 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a30-account-create-jbfsq" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.622530 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.627563 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2a30-account-create-jbfsq"] Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.661329 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2v7n\" (UniqueName: \"kubernetes.io/projected/2cf39398-6e15-4170-9caa-b4a73cca0a46-kube-api-access-w2v7n\") pod \"placement-2a30-account-create-jbfsq\" (UID: \"2cf39398-6e15-4170-9caa-b4a73cca0a46\") " pod="openstack/placement-2a30-account-create-jbfsq" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.679593 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a085-account-create-t62dz" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.763034 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2v7n\" (UniqueName: \"kubernetes.io/projected/2cf39398-6e15-4170-9caa-b4a73cca0a46-kube-api-access-w2v7n\") pod \"placement-2a30-account-create-jbfsq\" (UID: \"2cf39398-6e15-4170-9caa-b4a73cca0a46\") " pod="openstack/placement-2a30-account-create-jbfsq" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.783901 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2v7n\" (UniqueName: \"kubernetes.io/projected/2cf39398-6e15-4170-9caa-b4a73cca0a46-kube-api-access-w2v7n\") pod \"placement-2a30-account-create-jbfsq\" (UID: \"2cf39398-6e15-4170-9caa-b4a73cca0a46\") " pod="openstack/placement-2a30-account-create-jbfsq" Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.955397 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a085-account-create-t62dz"] Sep 30 19:04:05 crc kubenswrapper[4747]: I0930 19:04:05.966836 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a30-account-create-jbfsq" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.125702 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xnvtm"] Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.126893 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.130101 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.130355 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wm2bc" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.135965 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xnvtm"] Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.172214 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-config-data\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.172564 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-combined-ca-bundle\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.172604 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwdm\" (UniqueName: \"kubernetes.io/projected/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-kube-api-access-vcwdm\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.172694 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-db-sync-config-data\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.274366 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-config-data\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.274436 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-combined-ca-bundle\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.274468 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwdm\" (UniqueName: \"kubernetes.io/projected/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-kube-api-access-vcwdm\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.274546 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-db-sync-config-data\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.280976 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-db-sync-config-data\") pod \"glance-db-sync-xnvtm\" (UID: 
\"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.281077 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-combined-ca-bundle\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.288094 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-config-data\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.290411 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwdm\" (UniqueName: \"kubernetes.io/projected/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-kube-api-access-vcwdm\") pod \"glance-db-sync-xnvtm\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.463182 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.465474 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2a30-account-create-jbfsq"] Sep 30 19:04:06 crc kubenswrapper[4747]: W0930 19:04:06.489324 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cf39398_6e15_4170_9caa_b4a73cca0a46.slice/crio-a4f0ffa122a0f6fb58d563e3da265a4144486b8a811e7eaa57041c59f6a4e654 WatchSource:0}: Error finding container a4f0ffa122a0f6fb58d563e3da265a4144486b8a811e7eaa57041c59f6a4e654: Status 404 returned error can't find the container with id a4f0ffa122a0f6fb58d563e3da265a4144486b8a811e7eaa57041c59f6a4e654 Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.816133 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xgjfm" podUID="7211e36c-ce8d-434e-9952-e5f5eb7097ec" containerName="ovn-controller" probeResult="failure" output=< Sep 30 19:04:06 crc kubenswrapper[4747]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 19:04:06 crc kubenswrapper[4747]: > Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.817676 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.819944 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9nf99" Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.847710 4747 generic.go:334] "Generic (PLEG): container finished" podID="2cf39398-6e15-4170-9caa-b4a73cca0a46" containerID="53ef9f7f2354a088199bd3d70ce65e283dd3d75aa068387ff9c07d8c2bea465d" exitCode=0 Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.847780 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a30-account-create-jbfsq" 
event={"ID":"2cf39398-6e15-4170-9caa-b4a73cca0a46","Type":"ContainerDied","Data":"53ef9f7f2354a088199bd3d70ce65e283dd3d75aa068387ff9c07d8c2bea465d"} Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.847805 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a30-account-create-jbfsq" event={"ID":"2cf39398-6e15-4170-9caa-b4a73cca0a46","Type":"ContainerStarted","Data":"a4f0ffa122a0f6fb58d563e3da265a4144486b8a811e7eaa57041c59f6a4e654"} Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.851598 4747 generic.go:334] "Generic (PLEG): container finished" podID="1d0e13c0-006b-4a57-853f-2e0cc942319a" containerID="21eab6f41fbe85fcf87a21d10eed0608c9dfc6ee065b6b32e058b30e3c4add5e" exitCode=0 Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.851655 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a085-account-create-t62dz" event={"ID":"1d0e13c0-006b-4a57-853f-2e0cc942319a","Type":"ContainerDied","Data":"21eab6f41fbe85fcf87a21d10eed0608c9dfc6ee065b6b32e058b30e3c4add5e"} Sep 30 19:04:06 crc kubenswrapper[4747]: I0930 19:04:06.851706 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a085-account-create-t62dz" event={"ID":"1d0e13c0-006b-4a57-853f-2e0cc942319a","Type":"ContainerStarted","Data":"f0565bee8d623dfc400f48538e4822c24d57fce247390b32cb7a6ac9bc2adf7e"} Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.043331 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xgjfm-config-87lnq"] Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.044286 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.047180 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.072762 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xgjfm-config-87lnq"] Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.080688 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xnvtm"] Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.098310 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7nxm\" (UniqueName: \"kubernetes.io/projected/b6e9dc01-ace7-4286-9a00-61f55172896e-kube-api-access-t7nxm\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.098392 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-scripts\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.098479 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.098658 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-additional-scripts\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.098708 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-log-ovn\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.098899 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run-ovn\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: W0930 19:04:07.106735 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10f3e5f1_0b03_499d_aa2e_1efa9d8c2154.slice/crio-bd361ad2382799ab709c2f4135e4834e4c0b9cc63394d2e33c244632d2525f37 WatchSource:0}: Error finding container bd361ad2382799ab709c2f4135e4834e4c0b9cc63394d2e33c244632d2525f37: Status 404 returned error can't find the container with id bd361ad2382799ab709c2f4135e4834e4c0b9cc63394d2e33c244632d2525f37 Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.200200 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run-ovn\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " 
pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.200783 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7nxm\" (UniqueName: \"kubernetes.io/projected/b6e9dc01-ace7-4286-9a00-61f55172896e-kube-api-access-t7nxm\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.200944 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-scripts\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.200973 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.201093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-additional-scripts\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.201153 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-log-ovn\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " 
pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.201340 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-log-ovn\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.200626 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run-ovn\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.202816 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.203789 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-additional-scripts\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.205750 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-scripts\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc 
kubenswrapper[4747]: I0930 19:04:07.227236 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7nxm\" (UniqueName: \"kubernetes.io/projected/b6e9dc01-ace7-4286-9a00-61f55172896e-kube-api-access-t7nxm\") pod \"ovn-controller-xgjfm-config-87lnq\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.363300 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.655907 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.656397 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.699400 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xgjfm-config-87lnq"] Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.863516 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnvtm" event={"ID":"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154","Type":"ContainerStarted","Data":"bd361ad2382799ab709c2f4135e4834e4c0b9cc63394d2e33c244632d2525f37"} Sep 30 19:04:07 crc kubenswrapper[4747]: I0930 19:04:07.864891 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xgjfm-config-87lnq" 
event={"ID":"b6e9dc01-ace7-4286-9a00-61f55172896e","Type":"ContainerStarted","Data":"6e6de977ca0309a15236bc19faa45f37c27ec36609ec6445b3eac893a7a8170a"} Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.240845 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a30-account-create-jbfsq" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.316851 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2v7n\" (UniqueName: \"kubernetes.io/projected/2cf39398-6e15-4170-9caa-b4a73cca0a46-kube-api-access-w2v7n\") pod \"2cf39398-6e15-4170-9caa-b4a73cca0a46\" (UID: \"2cf39398-6e15-4170-9caa-b4a73cca0a46\") " Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.323593 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf39398-6e15-4170-9caa-b4a73cca0a46-kube-api-access-w2v7n" (OuterVolumeSpecName: "kube-api-access-w2v7n") pod "2cf39398-6e15-4170-9caa-b4a73cca0a46" (UID: "2cf39398-6e15-4170-9caa-b4a73cca0a46"). InnerVolumeSpecName "kube-api-access-w2v7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.324208 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a085-account-create-t62dz" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.417908 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mldp2\" (UniqueName: \"kubernetes.io/projected/1d0e13c0-006b-4a57-853f-2e0cc942319a-kube-api-access-mldp2\") pod \"1d0e13c0-006b-4a57-853f-2e0cc942319a\" (UID: \"1d0e13c0-006b-4a57-853f-2e0cc942319a\") " Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.418321 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2v7n\" (UniqueName: \"kubernetes.io/projected/2cf39398-6e15-4170-9caa-b4a73cca0a46-kube-api-access-w2v7n\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.421596 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0e13c0-006b-4a57-853f-2e0cc942319a-kube-api-access-mldp2" (OuterVolumeSpecName: "kube-api-access-mldp2") pod "1d0e13c0-006b-4a57-853f-2e0cc942319a" (UID: "1d0e13c0-006b-4a57-853f-2e0cc942319a"). InnerVolumeSpecName "kube-api-access-mldp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.519461 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mldp2\" (UniqueName: \"kubernetes.io/projected/1d0e13c0-006b-4a57-853f-2e0cc942319a-kube-api-access-mldp2\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.883140 4747 generic.go:334] "Generic (PLEG): container finished" podID="b6e9dc01-ace7-4286-9a00-61f55172896e" containerID="0b7e3684ca04ab67e4c4a80052fedafbed4b74a50fc994f4be598431326e2225" exitCode=0 Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.883210 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xgjfm-config-87lnq" event={"ID":"b6e9dc01-ace7-4286-9a00-61f55172896e","Type":"ContainerDied","Data":"0b7e3684ca04ab67e4c4a80052fedafbed4b74a50fc994f4be598431326e2225"} Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.889204 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a30-account-create-jbfsq" event={"ID":"2cf39398-6e15-4170-9caa-b4a73cca0a46","Type":"ContainerDied","Data":"a4f0ffa122a0f6fb58d563e3da265a4144486b8a811e7eaa57041c59f6a4e654"} Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.889239 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f0ffa122a0f6fb58d563e3da265a4144486b8a811e7eaa57041c59f6a4e654" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.889242 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2a30-account-create-jbfsq" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.891042 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a085-account-create-t62dz" event={"ID":"1d0e13c0-006b-4a57-853f-2e0cc942319a","Type":"ContainerDied","Data":"f0565bee8d623dfc400f48538e4822c24d57fce247390b32cb7a6ac9bc2adf7e"} Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.891069 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0565bee8d623dfc400f48538e4822c24d57fce247390b32cb7a6ac9bc2adf7e" Sep 30 19:04:08 crc kubenswrapper[4747]: I0930 19:04:08.891077 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a085-account-create-t62dz" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.202157 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.247322 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7nxm\" (UniqueName: \"kubernetes.io/projected/b6e9dc01-ace7-4286-9a00-61f55172896e-kube-api-access-t7nxm\") pod \"b6e9dc01-ace7-4286-9a00-61f55172896e\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.247383 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-scripts\") pod \"b6e9dc01-ace7-4286-9a00-61f55172896e\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.247425 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run-ovn\") pod 
\"b6e9dc01-ace7-4286-9a00-61f55172896e\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.247545 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b6e9dc01-ace7-4286-9a00-61f55172896e" (UID: "b6e9dc01-ace7-4286-9a00-61f55172896e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.247585 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run\") pod \"b6e9dc01-ace7-4286-9a00-61f55172896e\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.247649 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run" (OuterVolumeSpecName: "var-run") pod "b6e9dc01-ace7-4286-9a00-61f55172896e" (UID: "b6e9dc01-ace7-4286-9a00-61f55172896e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.247731 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-additional-scripts\") pod \"b6e9dc01-ace7-4286-9a00-61f55172896e\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.248690 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-scripts" (OuterVolumeSpecName: "scripts") pod "b6e9dc01-ace7-4286-9a00-61f55172896e" (UID: "b6e9dc01-ace7-4286-9a00-61f55172896e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252123 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b6e9dc01-ace7-4286-9a00-61f55172896e" (UID: "b6e9dc01-ace7-4286-9a00-61f55172896e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252361 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-log-ovn\") pod \"b6e9dc01-ace7-4286-9a00-61f55172896e\" (UID: \"b6e9dc01-ace7-4286-9a00-61f55172896e\") " Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252434 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b6e9dc01-ace7-4286-9a00-61f55172896e" (UID: "b6e9dc01-ace7-4286-9a00-61f55172896e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252817 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e9dc01-ace7-4286-9a00-61f55172896e-kube-api-access-t7nxm" (OuterVolumeSpecName: "kube-api-access-t7nxm") pod "b6e9dc01-ace7-4286-9a00-61f55172896e" (UID: "b6e9dc01-ace7-4286-9a00-61f55172896e"). InnerVolumeSpecName "kube-api-access-t7nxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252878 4747 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252891 4747 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252900 4747 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252908 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7nxm\" (UniqueName: \"kubernetes.io/projected/b6e9dc01-ace7-4286-9a00-61f55172896e-kube-api-access-t7nxm\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252916 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e9dc01-ace7-4286-9a00-61f55172896e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.252937 4747 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e9dc01-ace7-4286-9a00-61f55172896e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.910177 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xgjfm-config-87lnq" event={"ID":"b6e9dc01-ace7-4286-9a00-61f55172896e","Type":"ContainerDied","Data":"6e6de977ca0309a15236bc19faa45f37c27ec36609ec6445b3eac893a7a8170a"} Sep 30 19:04:10 crc 
kubenswrapper[4747]: I0930 19:04:10.910210 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xgjfm-config-87lnq" Sep 30 19:04:10 crc kubenswrapper[4747]: I0930 19:04:10.910226 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e6de977ca0309a15236bc19faa45f37c27ec36609ec6445b3eac893a7a8170a" Sep 30 19:04:11 crc kubenswrapper[4747]: I0930 19:04:11.317346 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xgjfm-config-87lnq"] Sep 30 19:04:11 crc kubenswrapper[4747]: I0930 19:04:11.321668 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xgjfm-config-87lnq"] Sep 30 19:04:11 crc kubenswrapper[4747]: I0930 19:04:11.824041 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xgjfm" Sep 30 19:04:13 crc kubenswrapper[4747]: I0930 19:04:13.100593 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e9dc01-ace7-4286-9a00-61f55172896e" path="/var/lib/kubelet/pods/b6e9dc01-ace7-4286-9a00-61f55172896e/volumes" Sep 30 19:04:18 crc kubenswrapper[4747]: I0930 19:04:18.993185 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnvtm" event={"ID":"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154","Type":"ContainerStarted","Data":"44e389b786a338dc68ff82df392cf6560c31ee2c8bc2946f421aca101f5b29ce"} Sep 30 19:04:19 crc kubenswrapper[4747]: I0930 19:04:19.012735 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xnvtm" podStartSLOduration=2.027686674 podStartE2EDuration="13.012709694s" podCreationTimestamp="2025-09-30 19:04:06 +0000 UTC" firstStartedPulling="2025-09-30 19:04:07.109139257 +0000 UTC m=+1086.768619371" lastFinishedPulling="2025-09-30 19:04:18.094162267 +0000 UTC m=+1097.753642391" observedRunningTime="2025-09-30 19:04:19.007529416 +0000 UTC m=+1098.667009570" 
watchObservedRunningTime="2025-09-30 19:04:19.012709694 +0000 UTC m=+1098.672189838" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.226272 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.514089 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.536645 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-svxbf"] Sep 30 19:04:22 crc kubenswrapper[4747]: E0930 19:04:22.536975 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf39398-6e15-4170-9caa-b4a73cca0a46" containerName="mariadb-account-create" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.536990 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf39398-6e15-4170-9caa-b4a73cca0a46" containerName="mariadb-account-create" Sep 30 19:04:22 crc kubenswrapper[4747]: E0930 19:04:22.537002 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0e13c0-006b-4a57-853f-2e0cc942319a" containerName="mariadb-account-create" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.537008 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0e13c0-006b-4a57-853f-2e0cc942319a" containerName="mariadb-account-create" Sep 30 19:04:22 crc kubenswrapper[4747]: E0930 19:04:22.537030 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e9dc01-ace7-4286-9a00-61f55172896e" containerName="ovn-config" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.537036 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e9dc01-ace7-4286-9a00-61f55172896e" containerName="ovn-config" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.537188 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0e13c0-006b-4a57-853f-2e0cc942319a" 
containerName="mariadb-account-create" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.537198 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf39398-6e15-4170-9caa-b4a73cca0a46" containerName="mariadb-account-create" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.537214 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e9dc01-ace7-4286-9a00-61f55172896e" containerName="ovn-config" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.537711 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-svxbf" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.550877 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-svxbf"] Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.570647 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2tnw\" (UniqueName: \"kubernetes.io/projected/791723cb-ce23-48fc-b941-ae01909dc4a4-kube-api-access-d2tnw\") pod \"cinder-db-create-svxbf\" (UID: \"791723cb-ce23-48fc-b941-ae01909dc4a4\") " pod="openstack/cinder-db-create-svxbf" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.671713 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2tnw\" (UniqueName: \"kubernetes.io/projected/791723cb-ce23-48fc-b941-ae01909dc4a4-kube-api-access-d2tnw\") pod \"cinder-db-create-svxbf\" (UID: \"791723cb-ce23-48fc-b941-ae01909dc4a4\") " pod="openstack/cinder-db-create-svxbf" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.691484 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2tnw\" (UniqueName: \"kubernetes.io/projected/791723cb-ce23-48fc-b941-ae01909dc4a4-kube-api-access-d2tnw\") pod \"cinder-db-create-svxbf\" (UID: \"791723cb-ce23-48fc-b941-ae01909dc4a4\") " pod="openstack/cinder-db-create-svxbf" Sep 30 19:04:22 crc 
kubenswrapper[4747]: I0930 19:04:22.749903 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-bhzpm"] Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.751132 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bhzpm" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.768671 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bhzpm"] Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.863637 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-svxbf" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.874790 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcj6c\" (UniqueName: \"kubernetes.io/projected/bdc3a0c2-38a2-41a1-90d1-992ccba7d54f-kube-api-access-gcj6c\") pod \"neutron-db-create-bhzpm\" (UID: \"bdc3a0c2-38a2-41a1-90d1-992ccba7d54f\") " pod="openstack/neutron-db-create-bhzpm" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.903094 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-w6p75"] Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.904345 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.909231 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.909381 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.909489 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.909571 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q4ln8" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.919066 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w6p75"] Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.981265 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf29g\" (UniqueName: \"kubernetes.io/projected/19b34fa2-4ead-4998-a29d-a29d16dc9aea-kube-api-access-gf29g\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.981300 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-config-data\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.981496 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcj6c\" (UniqueName: \"kubernetes.io/projected/bdc3a0c2-38a2-41a1-90d1-992ccba7d54f-kube-api-access-gcj6c\") pod \"neutron-db-create-bhzpm\" (UID: 
\"bdc3a0c2-38a2-41a1-90d1-992ccba7d54f\") " pod="openstack/neutron-db-create-bhzpm" Sep 30 19:04:22 crc kubenswrapper[4747]: I0930 19:04:22.981605 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-combined-ca-bundle\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.009887 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcj6c\" (UniqueName: \"kubernetes.io/projected/bdc3a0c2-38a2-41a1-90d1-992ccba7d54f-kube-api-access-gcj6c\") pod \"neutron-db-create-bhzpm\" (UID: \"bdc3a0c2-38a2-41a1-90d1-992ccba7d54f\") " pod="openstack/neutron-db-create-bhzpm" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.068201 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bhzpm" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.082595 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-combined-ca-bundle\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.082655 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf29g\" (UniqueName: \"kubernetes.io/projected/19b34fa2-4ead-4998-a29d-a29d16dc9aea-kube-api-access-gf29g\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.082679 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-config-data\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.089019 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-config-data\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.099765 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-combined-ca-bundle\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.103405 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf29g\" (UniqueName: \"kubernetes.io/projected/19b34fa2-4ead-4998-a29d-a29d16dc9aea-kube-api-access-gf29g\") pod \"keystone-db-sync-w6p75\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.151548 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-svxbf"] Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.304423 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.365883 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bhzpm"] Sep 30 19:04:23 crc kubenswrapper[4747]: W0930 19:04:23.372105 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdc3a0c2_38a2_41a1_90d1_992ccba7d54f.slice/crio-37f4239362132b6051fbfbd61c074cf0ff42675000334a140220061182b3f0f2 WatchSource:0}: Error finding container 37f4239362132b6051fbfbd61c074cf0ff42675000334a140220061182b3f0f2: Status 404 returned error can't find the container with id 37f4239362132b6051fbfbd61c074cf0ff42675000334a140220061182b3f0f2 Sep 30 19:04:23 crc kubenswrapper[4747]: I0930 19:04:23.545121 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w6p75"] Sep 30 19:04:23 crc kubenswrapper[4747]: W0930 19:04:23.549384 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19b34fa2_4ead_4998_a29d_a29d16dc9aea.slice/crio-e494b062a64e2b824c13df54905f2a627c01db8e37c5c9f2817c4da5bfd660e8 WatchSource:0}: Error finding container e494b062a64e2b824c13df54905f2a627c01db8e37c5c9f2817c4da5bfd660e8: Status 404 returned error can't find the container with id e494b062a64e2b824c13df54905f2a627c01db8e37c5c9f2817c4da5bfd660e8 Sep 30 19:04:24 crc kubenswrapper[4747]: I0930 19:04:24.061396 4747 generic.go:334] "Generic (PLEG): container finished" podID="bdc3a0c2-38a2-41a1-90d1-992ccba7d54f" containerID="05f707f4bf554b46117c9ba81bc97f84f57dbf0bc279af3831d324ad28849bdf" exitCode=0 Sep 30 19:04:24 crc kubenswrapper[4747]: I0930 19:04:24.061511 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bhzpm" 
event={"ID":"bdc3a0c2-38a2-41a1-90d1-992ccba7d54f","Type":"ContainerDied","Data":"05f707f4bf554b46117c9ba81bc97f84f57dbf0bc279af3831d324ad28849bdf"} Sep 30 19:04:24 crc kubenswrapper[4747]: I0930 19:04:24.061538 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bhzpm" event={"ID":"bdc3a0c2-38a2-41a1-90d1-992ccba7d54f","Type":"ContainerStarted","Data":"37f4239362132b6051fbfbd61c074cf0ff42675000334a140220061182b3f0f2"} Sep 30 19:04:24 crc kubenswrapper[4747]: I0930 19:04:24.063479 4747 generic.go:334] "Generic (PLEG): container finished" podID="791723cb-ce23-48fc-b941-ae01909dc4a4" containerID="f94116fc3f08a21c17ccaad0950a285bf4b9d0c575cb327eeb29c2d4fda74417" exitCode=0 Sep 30 19:04:24 crc kubenswrapper[4747]: I0930 19:04:24.063543 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-svxbf" event={"ID":"791723cb-ce23-48fc-b941-ae01909dc4a4","Type":"ContainerDied","Data":"f94116fc3f08a21c17ccaad0950a285bf4b9d0c575cb327eeb29c2d4fda74417"} Sep 30 19:04:24 crc kubenswrapper[4747]: I0930 19:04:24.063570 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-svxbf" event={"ID":"791723cb-ce23-48fc-b941-ae01909dc4a4","Type":"ContainerStarted","Data":"b8ff6e012ccaac916c371f7bfc78ed0926a08a887d9c83d97cc257022e3e62a8"} Sep 30 19:04:24 crc kubenswrapper[4747]: I0930 19:04:24.065337 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w6p75" event={"ID":"19b34fa2-4ead-4998-a29d-a29d16dc9aea","Type":"ContainerStarted","Data":"e494b062a64e2b824c13df54905f2a627c01db8e37c5c9f2817c4da5bfd660e8"} Sep 30 19:04:26 crc kubenswrapper[4747]: I0930 19:04:26.088059 4747 generic.go:334] "Generic (PLEG): container finished" podID="10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" containerID="44e389b786a338dc68ff82df392cf6560c31ee2c8bc2946f421aca101f5b29ce" exitCode=0 Sep 30 19:04:26 crc kubenswrapper[4747]: I0930 19:04:26.088201 4747 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-db-sync-xnvtm" event={"ID":"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154","Type":"ContainerDied","Data":"44e389b786a338dc68ff82df392cf6560c31ee2c8bc2946f421aca101f5b29ce"} Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.629510 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bhzpm" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.705688 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcj6c\" (UniqueName: \"kubernetes.io/projected/bdc3a0c2-38a2-41a1-90d1-992ccba7d54f-kube-api-access-gcj6c\") pod \"bdc3a0c2-38a2-41a1-90d1-992ccba7d54f\" (UID: \"bdc3a0c2-38a2-41a1-90d1-992ccba7d54f\") " Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.710362 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc3a0c2-38a2-41a1-90d1-992ccba7d54f-kube-api-access-gcj6c" (OuterVolumeSpecName: "kube-api-access-gcj6c") pod "bdc3a0c2-38a2-41a1-90d1-992ccba7d54f" (UID: "bdc3a0c2-38a2-41a1-90d1-992ccba7d54f"). InnerVolumeSpecName "kube-api-access-gcj6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.713566 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.779347 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-svxbf" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.807953 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-combined-ca-bundle\") pod \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.808031 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2tnw\" (UniqueName: \"kubernetes.io/projected/791723cb-ce23-48fc-b941-ae01909dc4a4-kube-api-access-d2tnw\") pod \"791723cb-ce23-48fc-b941-ae01909dc4a4\" (UID: \"791723cb-ce23-48fc-b941-ae01909dc4a4\") " Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.808097 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-db-sync-config-data\") pod \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.808168 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcwdm\" (UniqueName: \"kubernetes.io/projected/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-kube-api-access-vcwdm\") pod \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.808203 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-config-data\") pod \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\" (UID: \"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154\") " Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.808969 4747 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-gcj6c\" (UniqueName: \"kubernetes.io/projected/bdc3a0c2-38a2-41a1-90d1-992ccba7d54f-kube-api-access-gcj6c\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.812229 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/791723cb-ce23-48fc-b941-ae01909dc4a4-kube-api-access-d2tnw" (OuterVolumeSpecName: "kube-api-access-d2tnw") pod "791723cb-ce23-48fc-b941-ae01909dc4a4" (UID: "791723cb-ce23-48fc-b941-ae01909dc4a4"). InnerVolumeSpecName "kube-api-access-d2tnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.812323 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-kube-api-access-vcwdm" (OuterVolumeSpecName: "kube-api-access-vcwdm") pod "10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" (UID: "10f3e5f1-0b03-499d-aa2e-1efa9d8c2154"). InnerVolumeSpecName "kube-api-access-vcwdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.813723 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" (UID: "10f3e5f1-0b03-499d-aa2e-1efa9d8c2154"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.855393 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" (UID: "10f3e5f1-0b03-499d-aa2e-1efa9d8c2154"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.872910 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-config-data" (OuterVolumeSpecName: "config-data") pod "10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" (UID: "10f3e5f1-0b03-499d-aa2e-1efa9d8c2154"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.911050 4747 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.911104 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcwdm\" (UniqueName: \"kubernetes.io/projected/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-kube-api-access-vcwdm\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.911128 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.911147 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:28 crc kubenswrapper[4747]: I0930 19:04:28.911165 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2tnw\" (UniqueName: \"kubernetes.io/projected/791723cb-ce23-48fc-b941-ae01909dc4a4-kube-api-access-d2tnw\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.125667 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-w6p75" event={"ID":"19b34fa2-4ead-4998-a29d-a29d16dc9aea","Type":"ContainerStarted","Data":"624e3b52353e8b7d95c206eb9c83eac38ee7ad1e796141ad957a0ffea31cd384"} Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.135178 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnvtm" event={"ID":"10f3e5f1-0b03-499d-aa2e-1efa9d8c2154","Type":"ContainerDied","Data":"bd361ad2382799ab709c2f4135e4834e4c0b9cc63394d2e33c244632d2525f37"} Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.135213 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd361ad2382799ab709c2f4135e4834e4c0b9cc63394d2e33c244632d2525f37" Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.135259 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xnvtm" Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.151719 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bhzpm" Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.152221 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bhzpm" event={"ID":"bdc3a0c2-38a2-41a1-90d1-992ccba7d54f","Type":"ContainerDied","Data":"37f4239362132b6051fbfbd61c074cf0ff42675000334a140220061182b3f0f2"} Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.152544 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37f4239362132b6051fbfbd61c074cf0ff42675000334a140220061182b3f0f2" Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.155769 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-svxbf" event={"ID":"791723cb-ce23-48fc-b941-ae01909dc4a4","Type":"ContainerDied","Data":"b8ff6e012ccaac916c371f7bfc78ed0926a08a887d9c83d97cc257022e3e62a8"} Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.155810 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8ff6e012ccaac916c371f7bfc78ed0926a08a887d9c83d97cc257022e3e62a8" Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.155894 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-svxbf" Sep 30 19:04:29 crc kubenswrapper[4747]: I0930 19:04:29.166511 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-w6p75" podStartSLOduration=2.099257787 podStartE2EDuration="7.166480641s" podCreationTimestamp="2025-09-30 19:04:22 +0000 UTC" firstStartedPulling="2025-09-30 19:04:23.551213283 +0000 UTC m=+1103.210693397" lastFinishedPulling="2025-09-30 19:04:28.618436097 +0000 UTC m=+1108.277916251" observedRunningTime="2025-09-30 19:04:29.151493471 +0000 UTC m=+1108.810973605" watchObservedRunningTime="2025-09-30 19:04:29.166480641 +0000 UTC m=+1108.825960785" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.104206 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-58x64"] Sep 30 19:04:30 crc kubenswrapper[4747]: E0930 19:04:30.104837 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791723cb-ce23-48fc-b941-ae01909dc4a4" containerName="mariadb-database-create" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.104850 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="791723cb-ce23-48fc-b941-ae01909dc4a4" containerName="mariadb-database-create" Sep 30 19:04:30 crc kubenswrapper[4747]: E0930 19:04:30.104865 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" containerName="glance-db-sync" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.104871 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" containerName="glance-db-sync" Sep 30 19:04:30 crc kubenswrapper[4747]: E0930 19:04:30.104919 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc3a0c2-38a2-41a1-90d1-992ccba7d54f" containerName="mariadb-database-create" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.104950 4747 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bdc3a0c2-38a2-41a1-90d1-992ccba7d54f" containerName="mariadb-database-create" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.105420 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc3a0c2-38a2-41a1-90d1-992ccba7d54f" containerName="mariadb-database-create" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.105435 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="791723cb-ce23-48fc-b941-ae01909dc4a4" containerName="mariadb-database-create" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.105472 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" containerName="glance-db-sync" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.106265 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.116043 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-58x64"] Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.147585 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-config\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.147627 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.147758 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-dns-svc\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.147801 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zj4v\" (UniqueName: \"kubernetes.io/projected/368862da-f0e8-461b-8fa3-201e3ff3d43f-kube-api-access-5zj4v\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.147849 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.249948 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.250086 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-config\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.250131 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.250284 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-dns-svc\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.250395 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zj4v\" (UniqueName: \"kubernetes.io/projected/368862da-f0e8-461b-8fa3-201e3ff3d43f-kube-api-access-5zj4v\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.251065 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.251304 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-dns-svc\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.251407 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-config\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.251950 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.273096 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zj4v\" (UniqueName: \"kubernetes.io/projected/368862da-f0e8-461b-8fa3-201e3ff3d43f-kube-api-access-5zj4v\") pod \"dnsmasq-dns-554567b4f7-58x64\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.425708 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:30 crc kubenswrapper[4747]: I0930 19:04:30.781866 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-58x64"] Sep 30 19:04:31 crc kubenswrapper[4747]: I0930 19:04:31.173080 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-58x64" event={"ID":"368862da-f0e8-461b-8fa3-201e3ff3d43f","Type":"ContainerStarted","Data":"0b767b99473b3c4facad39e986989fcd5173c43d18c456ab5ac6eec73d972dad"} Sep 30 19:04:32 crc kubenswrapper[4747]: I0930 19:04:32.188020 4747 generic.go:334] "Generic (PLEG): container finished" podID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerID="83d76a990af36ed38f5500178f8c016a0c27b13c898836eaa3327ecd0a2f0e3a" exitCode=0 Sep 30 19:04:32 crc kubenswrapper[4747]: I0930 19:04:32.188102 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-58x64" event={"ID":"368862da-f0e8-461b-8fa3-201e3ff3d43f","Type":"ContainerDied","Data":"83d76a990af36ed38f5500178f8c016a0c27b13c898836eaa3327ecd0a2f0e3a"} Sep 30 19:04:33 crc kubenswrapper[4747]: I0930 19:04:33.203728 4747 generic.go:334] "Generic (PLEG): container finished" podID="19b34fa2-4ead-4998-a29d-a29d16dc9aea" containerID="624e3b52353e8b7d95c206eb9c83eac38ee7ad1e796141ad957a0ffea31cd384" exitCode=0 Sep 30 19:04:33 crc kubenswrapper[4747]: I0930 19:04:33.203836 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w6p75" event={"ID":"19b34fa2-4ead-4998-a29d-a29d16dc9aea","Type":"ContainerDied","Data":"624e3b52353e8b7d95c206eb9c83eac38ee7ad1e796141ad957a0ffea31cd384"} Sep 30 19:04:33 crc kubenswrapper[4747]: I0930 19:04:33.209499 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-58x64" event={"ID":"368862da-f0e8-461b-8fa3-201e3ff3d43f","Type":"ContainerStarted","Data":"54e3378750f731e224106ad1b5b955ba3160ff8c66f18abb57f22239769807a3"} Sep 
30 19:04:33 crc kubenswrapper[4747]: I0930 19:04:33.209708 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:33 crc kubenswrapper[4747]: I0930 19:04:33.270613 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-554567b4f7-58x64" podStartSLOduration=3.2705880560000002 podStartE2EDuration="3.270588056s" podCreationTimestamp="2025-09-30 19:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:33.266731535 +0000 UTC m=+1112.926211689" watchObservedRunningTime="2025-09-30 19:04:33.270588056 +0000 UTC m=+1112.930068210" Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.754391 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.824584 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-combined-ca-bundle\") pod \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.825290 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-config-data\") pod \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\" (UID: \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.825404 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf29g\" (UniqueName: \"kubernetes.io/projected/19b34fa2-4ead-4998-a29d-a29d16dc9aea-kube-api-access-gf29g\") pod \"19b34fa2-4ead-4998-a29d-a29d16dc9aea\" (UID: 
\"19b34fa2-4ead-4998-a29d-a29d16dc9aea\") " Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.832383 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b34fa2-4ead-4998-a29d-a29d16dc9aea-kube-api-access-gf29g" (OuterVolumeSpecName: "kube-api-access-gf29g") pod "19b34fa2-4ead-4998-a29d-a29d16dc9aea" (UID: "19b34fa2-4ead-4998-a29d-a29d16dc9aea"). InnerVolumeSpecName "kube-api-access-gf29g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.860805 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19b34fa2-4ead-4998-a29d-a29d16dc9aea" (UID: "19b34fa2-4ead-4998-a29d-a29d16dc9aea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.895344 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-config-data" (OuterVolumeSpecName: "config-data") pod "19b34fa2-4ead-4998-a29d-a29d16dc9aea" (UID: "19b34fa2-4ead-4998-a29d-a29d16dc9aea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.927851 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.928176 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19b34fa2-4ead-4998-a29d-a29d16dc9aea-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:34 crc kubenswrapper[4747]: I0930 19:04:34.928349 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf29g\" (UniqueName: \"kubernetes.io/projected/19b34fa2-4ead-4998-a29d-a29d16dc9aea-kube-api-access-gf29g\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.231449 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w6p75" event={"ID":"19b34fa2-4ead-4998-a29d-a29d16dc9aea","Type":"ContainerDied","Data":"e494b062a64e2b824c13df54905f2a627c01db8e37c5c9f2817c4da5bfd660e8"} Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.231499 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e494b062a64e2b824c13df54905f2a627c01db8e37c5c9f2817c4da5bfd660e8" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.231503 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w6p75" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.532896 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5cdw6"] Sep 30 19:04:35 crc kubenswrapper[4747]: E0930 19:04:35.533350 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b34fa2-4ead-4998-a29d-a29d16dc9aea" containerName="keystone-db-sync" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.533373 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b34fa2-4ead-4998-a29d-a29d16dc9aea" containerName="keystone-db-sync" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.533563 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b34fa2-4ead-4998-a29d-a29d16dc9aea" containerName="keystone-db-sync" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.534243 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.537121 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.537156 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q4ln8" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.541970 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-58x64"] Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.542220 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-554567b4f7-58x64" podUID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerName="dnsmasq-dns" containerID="cri-o://54e3378750f731e224106ad1b5b955ba3160ff8c66f18abb57f22239769807a3" gracePeriod=10 Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.543439 4747 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.545677 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.578382 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67795cd9-x9gmx"] Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.585965 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647243 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647294 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647316 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-combined-ca-bundle\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647336 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-dns-svc\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647358 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-scripts\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647385 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-config-data\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647419 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-fernet-keys\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647456 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8p4\" (UniqueName: \"kubernetes.io/projected/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-kube-api-access-nf8p4\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647476 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-config\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647494 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbw24\" (UniqueName: \"kubernetes.io/projected/46fece27-faee-4f8e-a347-c15ae7c48426-kube-api-access-gbw24\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.647534 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-credential-keys\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.651703 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-x9gmx"] Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.658252 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5cdw6"] Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.748764 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-credential-keys\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.749370 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.749477 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.749548 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-combined-ca-bundle\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.749627 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-dns-svc\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.749705 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-scripts\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.749788 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-config-data\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " 
pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.749877 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-fernet-keys\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.750048 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf8p4\" (UniqueName: \"kubernetes.io/projected/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-kube-api-access-nf8p4\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.750131 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-config\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.750208 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbw24\" (UniqueName: \"kubernetes.io/projected/46fece27-faee-4f8e-a347-c15ae7c48426-kube-api-access-gbw24\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.750222 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 
19:04:35.750520 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.751232 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-config\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.752421 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-dns-svc\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.753655 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-combined-ca-bundle\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.753988 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-credential-keys\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.757777 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-scripts\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.761719 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-config-data\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.762844 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-fernet-keys\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.773556 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8p4\" (UniqueName: \"kubernetes.io/projected/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-kube-api-access-nf8p4\") pod \"dnsmasq-dns-67795cd9-x9gmx\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.792560 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbw24\" (UniqueName: \"kubernetes.io/projected/46fece27-faee-4f8e-a347-c15ae7c48426-kube-api-access-gbw24\") pod \"keystone-bootstrap-5cdw6\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.866833 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-h5f4l"] Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.867801 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.870008 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.870154 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2jdzz" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.870295 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.875558 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.919066 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h5f4l"] Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.921406 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.950605 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-x9gmx"] Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.958685 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-combined-ca-bundle\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.958756 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-scripts\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.958793 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddnrg\" (UniqueName: \"kubernetes.io/projected/2fcd68f8-070b-4361-841f-acea0b80118f-kube-api-access-ddnrg\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.958810 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-config-data\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:35 crc kubenswrapper[4747]: I0930 19:04:35.958845 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2fcd68f8-070b-4361-841f-acea0b80118f-logs\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.015476 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-flzbs"] Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.023494 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.024575 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-flzbs"] Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060464 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddnrg\" (UniqueName: \"kubernetes.io/projected/2fcd68f8-070b-4361-841f-acea0b80118f-kube-api-access-ddnrg\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060505 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-config-data\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060554 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fcd68f8-070b-4361-841f-acea0b80118f-logs\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060581 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5vmlc\" (UniqueName: \"kubernetes.io/projected/585e57fe-da27-46d4-9b1d-d9242b17081c-kube-api-access-5vmlc\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060604 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-config\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060640 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060694 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060718 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-combined-ca-bundle\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060744 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.060768 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-scripts\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.061329 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fcd68f8-070b-4361-841f-acea0b80118f-logs\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.066675 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-combined-ca-bundle\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.067343 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-scripts\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.067675 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-config-data\") pod \"placement-db-sync-h5f4l\" (UID: 
\"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.086729 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddnrg\" (UniqueName: \"kubernetes.io/projected/2fcd68f8-070b-4361-841f-acea0b80118f-kube-api-access-ddnrg\") pod \"placement-db-sync-h5f4l\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.162117 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vmlc\" (UniqueName: \"kubernetes.io/projected/585e57fe-da27-46d4-9b1d-d9242b17081c-kube-api-access-5vmlc\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.162156 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-config\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.162192 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.162243 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " 
pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.162275 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.163258 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.164003 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-config\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.164609 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.166173 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.182746 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vmlc\" (UniqueName: \"kubernetes.io/projected/585e57fe-da27-46d4-9b1d-d9242b17081c-kube-api-access-5vmlc\") pod \"dnsmasq-dns-5b6dbdb6f5-flzbs\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.194044 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.242907 4747 generic.go:334] "Generic (PLEG): container finished" podID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerID="54e3378750f731e224106ad1b5b955ba3160ff8c66f18abb57f22239769807a3" exitCode=0 Sep 30 19:04:36 crc kubenswrapper[4747]: I0930 19:04:36.242972 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-58x64" event={"ID":"368862da-f0e8-461b-8fa3-201e3ff3d43f","Type":"ContainerDied","Data":"54e3378750f731e224106ad1b5b955ba3160ff8c66f18abb57f22239769807a3"} Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.360243 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.374670 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5cdw6"] Sep 30 19:04:39 crc kubenswrapper[4747]: W0930 19:04:36.384160 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46fece27_faee_4f8e_a347_c15ae7c48426.slice/crio-b3ee133dfee9244b6b2aa9fecc7a9124ec80f8c8790f5fdbad525e71208b7b4b WatchSource:0}: Error finding container b3ee133dfee9244b6b2aa9fecc7a9124ec80f8c8790f5fdbad525e71208b7b4b: Status 404 returned error can't find the container with id b3ee133dfee9244b6b2aa9fecc7a9124ec80f8c8790f5fdbad525e71208b7b4b Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.498858 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-x9gmx"] Sep 30 19:04:39 crc kubenswrapper[4747]: W0930 19:04:36.508691 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9d4e09f_7606_445f_9e34_cbb0b675a7f5.slice/crio-ad269026492a83ee42041c798fcb2ebb3f8a9a2c34e8bc01529c42f79b728788 WatchSource:0}: Error finding container ad269026492a83ee42041c798fcb2ebb3f8a9a2c34e8bc01529c42f79b728788: Status 404 returned error can't find the container with id ad269026492a83ee42041c798fcb2ebb3f8a9a2c34e8bc01529c42f79b728788 Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.637995 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.639717 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.646228 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.646474 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.647706 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wm2bc" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.650235 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.657257 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.716127 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.727966 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.732568 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.736309 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.737462 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.775020 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.775062 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlvrn\" (UniqueName: \"kubernetes.io/projected/bee42841-64d1-46a9-ab0d-bacbfb82adfe-kube-api-access-jlvrn\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.775095 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.775139 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.775293 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-scripts\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.775363 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.775413 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-logs\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.775449 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-config-data\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877309 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877354 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-logs\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877376 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlvrn\" (UniqueName: \"kubernetes.io/projected/bee42841-64d1-46a9-ab0d-bacbfb82adfe-kube-api-access-jlvrn\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877397 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877417 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877474 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877501 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877561 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877613 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-scripts\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877638 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877660 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877686 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877815 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-logs\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877714 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877843 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-config-data\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877864 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhdt\" (UniqueName: 
\"kubernetes.io/projected/5451328f-0679-4ef6-9b22-503d6c4f0a67-kube-api-access-bjhdt\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.877882 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.878420 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.878668 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-logs\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.884307 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-config-data\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.884480 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.885306 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.888797 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.898299 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlvrn\" (UniqueName: \"kubernetes.io/projected/bee42841-64d1-46a9-ab0d-bacbfb82adfe-kube-api-access-jlvrn\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.911001 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979112 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" 
Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979169 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979226 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979261 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979333 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhdt\" (UniqueName: \"kubernetes.io/projected/5451328f-0679-4ef6-9b22-503d6c4f0a67-kube-api-access-bjhdt\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979356 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979421 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-logs\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979450 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.979849 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.980057 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.982050 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-logs\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.983812 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.985066 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.985550 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:36.997018 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.001658 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhdt\" (UniqueName: \"kubernetes.io/projected/5451328f-0679-4ef6-9b22-503d6c4f0a67-kube-api-access-bjhdt\") pod \"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.006275 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.057643 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.068282 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.253335 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-x9gmx" event={"ID":"f9d4e09f-7606-445f-9e34-cbb0b675a7f5","Type":"ContainerStarted","Data":"ad269026492a83ee42041c798fcb2ebb3f8a9a2c34e8bc01529c42f79b728788"} Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.255696 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5cdw6" event={"ID":"46fece27-faee-4f8e-a347-c15ae7c48426","Type":"ContainerStarted","Data":"b3ee133dfee9244b6b2aa9fecc7a9124ec80f8c8790f5fdbad525e71208b7b4b"} Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.655732 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.656008 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.656049 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.656753 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d1099e7b2e4398f262beea3d468545e7d66ad71310f8a69af5aae38ae2c6601"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.656806 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://2d1099e7b2e4398f262beea3d468545e7d66ad71310f8a69af5aae38ae2c6601" gracePeriod=600 Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.806797 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:37.873067 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:38.270570 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="2d1099e7b2e4398f262beea3d468545e7d66ad71310f8a69af5aae38ae2c6601" exitCode=0 Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:38.270627 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"2d1099e7b2e4398f262beea3d468545e7d66ad71310f8a69af5aae38ae2c6601"} Sep 30 19:04:39 crc kubenswrapper[4747]: I0930 19:04:38.270685 4747 scope.go:117] "RemoveContainer" 
containerID="9324a2c247fa6850748cdd90467f095bd666e1119af1ca69c9f4d4385e9867bb" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.293739 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5cdw6" event={"ID":"46fece27-faee-4f8e-a347-c15ae7c48426","Type":"ContainerStarted","Data":"a26df6585ef0623801339b8dcc91545c365b3eda853776c4f0c687c5fd7b10ec"} Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.332883 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"c119b3f4e3827265fb0e759c838f87c36311898c6328de90654d566acfc99097"} Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.354367 4747 generic.go:334] "Generic (PLEG): container finished" podID="f9d4e09f-7606-445f-9e34-cbb0b675a7f5" containerID="a47e7d86c8cb70138bbcd171fe7eed2ef5a4fbe4471525b64b06c2faaa47907a" exitCode=0 Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.354421 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-x9gmx" event={"ID":"f9d4e09f-7606-445f-9e34-cbb0b675a7f5","Type":"ContainerDied","Data":"a47e7d86c8cb70138bbcd171fe7eed2ef5a4fbe4471525b64b06c2faaa47907a"} Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.363355 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5cdw6" podStartSLOduration=5.363335581 podStartE2EDuration="5.363335581s" podCreationTimestamp="2025-09-30 19:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:40.330947961 +0000 UTC m=+1119.990428075" watchObservedRunningTime="2025-09-30 19:04:40.363335581 +0000 UTC m=+1120.022815695" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.534001 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5b6dbdb6f5-flzbs"] Sep 30 19:04:40 crc kubenswrapper[4747]: W0930 19:04:40.550684 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod585e57fe_da27_46d4_9b1d_d9242b17081c.slice/crio-1aad4ff16ffbb3ef4e3543bc9b1f362c43f36616fc6547da005b67296b45d03d WatchSource:0}: Error finding container 1aad4ff16ffbb3ef4e3543bc9b1f362c43f36616fc6547da005b67296b45d03d: Status 404 returned error can't find the container with id 1aad4ff16ffbb3ef4e3543bc9b1f362c43f36616fc6547da005b67296b45d03d Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.554415 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h5f4l"] Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.643068 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.698440 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.777442 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-nb\") pod \"368862da-f0e8-461b-8fa3-201e3ff3d43f\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.777682 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-dns-svc\") pod \"368862da-f0e8-461b-8fa3-201e3ff3d43f\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.777704 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zj4v\" (UniqueName: 
\"kubernetes.io/projected/368862da-f0e8-461b-8fa3-201e3ff3d43f-kube-api-access-5zj4v\") pod \"368862da-f0e8-461b-8fa3-201e3ff3d43f\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.777740 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-config\") pod \"368862da-f0e8-461b-8fa3-201e3ff3d43f\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.777764 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-sb\") pod \"368862da-f0e8-461b-8fa3-201e3ff3d43f\" (UID: \"368862da-f0e8-461b-8fa3-201e3ff3d43f\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.784188 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368862da-f0e8-461b-8fa3-201e3ff3d43f-kube-api-access-5zj4v" (OuterVolumeSpecName: "kube-api-access-5zj4v") pod "368862da-f0e8-461b-8fa3-201e3ff3d43f" (UID: "368862da-f0e8-461b-8fa3-201e3ff3d43f"). InnerVolumeSpecName "kube-api-access-5zj4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.788663 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.831226 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "368862da-f0e8-461b-8fa3-201e3ff3d43f" (UID: "368862da-f0e8-461b-8fa3-201e3ff3d43f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.836636 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-config" (OuterVolumeSpecName: "config") pod "368862da-f0e8-461b-8fa3-201e3ff3d43f" (UID: "368862da-f0e8-461b-8fa3-201e3ff3d43f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.839732 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "368862da-f0e8-461b-8fa3-201e3ff3d43f" (UID: "368862da-f0e8-461b-8fa3-201e3ff3d43f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.848375 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "368862da-f0e8-461b-8fa3-201e3ff3d43f" (UID: "368862da-f0e8-461b-8fa3-201e3ff3d43f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.879891 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.879937 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.879948 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zj4v\" (UniqueName: \"kubernetes.io/projected/368862da-f0e8-461b-8fa3-201e3ff3d43f-kube-api-access-5zj4v\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.879958 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.879967 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368862da-f0e8-461b-8fa3-201e3ff3d43f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.980995 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-dns-svc\") pod \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.981084 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-sb\") pod 
\"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.981166 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-config\") pod \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.981225 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-nb\") pod \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.981297 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf8p4\" (UniqueName: \"kubernetes.io/projected/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-kube-api-access-nf8p4\") pod \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\" (UID: \"f9d4e09f-7606-445f-9e34-cbb0b675a7f5\") " Sep 30 19:04:40 crc kubenswrapper[4747]: I0930 19:04:40.996073 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-kube-api-access-nf8p4" (OuterVolumeSpecName: "kube-api-access-nf8p4") pod "f9d4e09f-7606-445f-9e34-cbb0b675a7f5" (UID: "f9d4e09f-7606-445f-9e34-cbb0b675a7f5"). InnerVolumeSpecName "kube-api-access-nf8p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.000691 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9d4e09f-7606-445f-9e34-cbb0b675a7f5" (UID: "f9d4e09f-7606-445f-9e34-cbb0b675a7f5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.000782 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9d4e09f-7606-445f-9e34-cbb0b675a7f5" (UID: "f9d4e09f-7606-445f-9e34-cbb0b675a7f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.001869 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9d4e09f-7606-445f-9e34-cbb0b675a7f5" (UID: "f9d4e09f-7606-445f-9e34-cbb0b675a7f5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.002642 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-config" (OuterVolumeSpecName: "config") pod "f9d4e09f-7606-445f-9e34-cbb0b675a7f5" (UID: "f9d4e09f-7606-445f-9e34-cbb0b675a7f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.083009 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.083043 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.083063 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.083071 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.083080 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf8p4\" (UniqueName: \"kubernetes.io/projected/f9d4e09f-7606-445f-9e34-cbb0b675a7f5-kube-api-access-nf8p4\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.371460 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5451328f-0679-4ef6-9b22-503d6c4f0a67","Type":"ContainerStarted","Data":"68dfdc3da9a3171ad878db2cbf1c3eb5ca1d92c5c9ce80967a41f2b70c81b74d"} Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.371744 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5451328f-0679-4ef6-9b22-503d6c4f0a67","Type":"ContainerStarted","Data":"52473ac44dcb0c6d3be95aac23a28f05cbe6f06be6392b272178811f13c473b5"} Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.374347 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h5f4l" event={"ID":"2fcd68f8-070b-4361-841f-acea0b80118f","Type":"ContainerStarted","Data":"3300aedc0dd6f01bbf3d5f1f6a2d7e40254a05388c7543a7bbf85a359be82c2b"} Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.377666 4747 generic.go:334] "Generic (PLEG): container finished" podID="585e57fe-da27-46d4-9b1d-d9242b17081c" containerID="383fe629a07f286528ea512fa3ba5c08b182a8d388e10d26a5aabb548c9c4280" exitCode=0 Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.377853 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" event={"ID":"585e57fe-da27-46d4-9b1d-d9242b17081c","Type":"ContainerDied","Data":"383fe629a07f286528ea512fa3ba5c08b182a8d388e10d26a5aabb548c9c4280"} Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.377886 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" event={"ID":"585e57fe-da27-46d4-9b1d-d9242b17081c","Type":"ContainerStarted","Data":"1aad4ff16ffbb3ef4e3543bc9b1f362c43f36616fc6547da005b67296b45d03d"} Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.385586 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-x9gmx" event={"ID":"f9d4e09f-7606-445f-9e34-cbb0b675a7f5","Type":"ContainerDied","Data":"ad269026492a83ee42041c798fcb2ebb3f8a9a2c34e8bc01529c42f79b728788"} Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.385633 4747 scope.go:117] "RemoveContainer" containerID="a47e7d86c8cb70138bbcd171fe7eed2ef5a4fbe4471525b64b06c2faaa47907a" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.385753 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-x9gmx" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.388501 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-58x64" event={"ID":"368862da-f0e8-461b-8fa3-201e3ff3d43f","Type":"ContainerDied","Data":"0b767b99473b3c4facad39e986989fcd5173c43d18c456ab5ac6eec73d972dad"} Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.388715 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-58x64" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.413472 4747 scope.go:117] "RemoveContainer" containerID="54e3378750f731e224106ad1b5b955ba3160ff8c66f18abb57f22239769807a3" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.434258 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-58x64"] Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.444047 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-58x64"] Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.475995 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-x9gmx"] Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.488616 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-x9gmx"] Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.526517 4747 scope.go:117] "RemoveContainer" containerID="83d76a990af36ed38f5500178f8c016a0c27b13c898836eaa3327ecd0a2f0e3a" Sep 30 19:04:41 crc kubenswrapper[4747]: I0930 19:04:41.731001 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:41 crc kubenswrapper[4747]: W0930 19:04:41.743892 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbee42841_64d1_46a9_ab0d_bacbfb82adfe.slice/crio-cc0df99ed285bbfaa09af8947b9f5aa3d08a2529d37a0e6a0f0c991b214d6d44 WatchSource:0}: Error finding container cc0df99ed285bbfaa09af8947b9f5aa3d08a2529d37a0e6a0f0c991b214d6d44: Status 404 returned error can't find the container with id cc0df99ed285bbfaa09af8947b9f5aa3d08a2529d37a0e6a0f0c991b214d6d44 Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.416025 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" event={"ID":"585e57fe-da27-46d4-9b1d-d9242b17081c","Type":"ContainerStarted","Data":"27dcf8ddf5ae46474c83614323b2e5c7b559306ca26994baf28acc96acbbb58c"} Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.416510 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.425223 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bee42841-64d1-46a9-ab0d-bacbfb82adfe","Type":"ContainerStarted","Data":"02819ee4daff16b63201d915d4a7edf2719ceebf62dfab8f9b9df3194d1367ab"} Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.425259 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bee42841-64d1-46a9-ab0d-bacbfb82adfe","Type":"ContainerStarted","Data":"cc0df99ed285bbfaa09af8947b9f5aa3d08a2529d37a0e6a0f0c991b214d6d44"} Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.428204 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5451328f-0679-4ef6-9b22-503d6c4f0a67","Type":"ContainerStarted","Data":"a5f379a608a7968b51fe7ff436cde791858e9b396784d3329176df76acb5aa05"} Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.428523 4747 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerName="glance-log" containerID="cri-o://68dfdc3da9a3171ad878db2cbf1c3eb5ca1d92c5c9ce80967a41f2b70c81b74d" gracePeriod=30 Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.428791 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerName="glance-httpd" containerID="cri-o://a5f379a608a7968b51fe7ff436cde791858e9b396784d3329176df76acb5aa05" gracePeriod=30 Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.443718 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" podStartSLOduration=7.443693676 podStartE2EDuration="7.443693676s" podCreationTimestamp="2025-09-30 19:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:42.437323513 +0000 UTC m=+1122.096803637" watchObservedRunningTime="2025-09-30 19:04:42.443693676 +0000 UTC m=+1122.103173800" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.475963 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.475942391 podStartE2EDuration="7.475942391s" podCreationTimestamp="2025-09-30 19:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:42.464961546 +0000 UTC m=+1122.124441670" watchObservedRunningTime="2025-09-30 19:04:42.475942391 +0000 UTC m=+1122.135422515" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.675371 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d52e-account-create-mb42m"] Sep 30 19:04:42 crc kubenswrapper[4747]: E0930 19:04:42.675734 4747 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerName="init" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.675754 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerName="init" Sep 30 19:04:42 crc kubenswrapper[4747]: E0930 19:04:42.675780 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerName="dnsmasq-dns" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.675791 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerName="dnsmasq-dns" Sep 30 19:04:42 crc kubenswrapper[4747]: E0930 19:04:42.675799 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d4e09f-7606-445f-9e34-cbb0b675a7f5" containerName="init" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.675807 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d4e09f-7606-445f-9e34-cbb0b675a7f5" containerName="init" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.676009 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerName="dnsmasq-dns" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.676052 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d4e09f-7606-445f-9e34-cbb0b675a7f5" containerName="init" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.676651 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d52e-account-create-mb42m" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.679665 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.682156 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d52e-account-create-mb42m"] Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.812597 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl46r\" (UniqueName: \"kubernetes.io/projected/2d86156e-2bee-4e6e-a152-5f2b92d0ed6e-kube-api-access-rl46r\") pod \"cinder-d52e-account-create-mb42m\" (UID: \"2d86156e-2bee-4e6e-a152-5f2b92d0ed6e\") " pod="openstack/cinder-d52e-account-create-mb42m" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.871726 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-665b-account-create-rsr8w"] Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.872809 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-665b-account-create-rsr8w" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.875552 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.887002 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-665b-account-create-rsr8w"] Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.913832 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl46r\" (UniqueName: \"kubernetes.io/projected/2d86156e-2bee-4e6e-a152-5f2b92d0ed6e-kube-api-access-rl46r\") pod \"cinder-d52e-account-create-mb42m\" (UID: \"2d86156e-2bee-4e6e-a152-5f2b92d0ed6e\") " pod="openstack/cinder-d52e-account-create-mb42m" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.941028 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl46r\" (UniqueName: \"kubernetes.io/projected/2d86156e-2bee-4e6e-a152-5f2b92d0ed6e-kube-api-access-rl46r\") pod \"cinder-d52e-account-create-mb42m\" (UID: \"2d86156e-2bee-4e6e-a152-5f2b92d0ed6e\") " pod="openstack/cinder-d52e-account-create-mb42m" Sep 30 19:04:42 crc kubenswrapper[4747]: I0930 19:04:42.999216 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d52e-account-create-mb42m" Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.016134 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf9zd\" (UniqueName: \"kubernetes.io/projected/fd91c92e-badb-4682-b203-7889aeeef868-kube-api-access-lf9zd\") pod \"neutron-665b-account-create-rsr8w\" (UID: \"fd91c92e-badb-4682-b203-7889aeeef868\") " pod="openstack/neutron-665b-account-create-rsr8w" Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.097408 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368862da-f0e8-461b-8fa3-201e3ff3d43f" path="/var/lib/kubelet/pods/368862da-f0e8-461b-8fa3-201e3ff3d43f/volumes" Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.097995 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d4e09f-7606-445f-9e34-cbb0b675a7f5" path="/var/lib/kubelet/pods/f9d4e09f-7606-445f-9e34-cbb0b675a7f5/volumes" Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.118156 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf9zd\" (UniqueName: \"kubernetes.io/projected/fd91c92e-badb-4682-b203-7889aeeef868-kube-api-access-lf9zd\") pod \"neutron-665b-account-create-rsr8w\" (UID: \"fd91c92e-badb-4682-b203-7889aeeef868\") " pod="openstack/neutron-665b-account-create-rsr8w" Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.147066 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf9zd\" (UniqueName: \"kubernetes.io/projected/fd91c92e-badb-4682-b203-7889aeeef868-kube-api-access-lf9zd\") pod \"neutron-665b-account-create-rsr8w\" (UID: \"fd91c92e-badb-4682-b203-7889aeeef868\") " pod="openstack/neutron-665b-account-create-rsr8w" Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.193995 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-665b-account-create-rsr8w" Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.436972 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bee42841-64d1-46a9-ab0d-bacbfb82adfe","Type":"ContainerStarted","Data":"a2a75eb81fc3e3dea64915dd213d23c5432db56f4b345bfe6d22247f76b457cd"} Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.437069 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerName="glance-log" containerID="cri-o://02819ee4daff16b63201d915d4a7edf2719ceebf62dfab8f9b9df3194d1367ab" gracePeriod=30 Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.437174 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerName="glance-httpd" containerID="cri-o://a2a75eb81fc3e3dea64915dd213d23c5432db56f4b345bfe6d22247f76b457cd" gracePeriod=30 Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.438804 4747 generic.go:334] "Generic (PLEG): container finished" podID="46fece27-faee-4f8e-a347-c15ae7c48426" containerID="a26df6585ef0623801339b8dcc91545c365b3eda853776c4f0c687c5fd7b10ec" exitCode=0 Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.438872 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5cdw6" event={"ID":"46fece27-faee-4f8e-a347-c15ae7c48426","Type":"ContainerDied","Data":"a26df6585ef0623801339b8dcc91545c365b3eda853776c4f0c687c5fd7b10ec"} Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.441003 4747 generic.go:334] "Generic (PLEG): container finished" podID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerID="a5f379a608a7968b51fe7ff436cde791858e9b396784d3329176df76acb5aa05" exitCode=0 Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.441025 4747 
generic.go:334] "Generic (PLEG): container finished" podID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerID="68dfdc3da9a3171ad878db2cbf1c3eb5ca1d92c5c9ce80967a41f2b70c81b74d" exitCode=143 Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.441080 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5451328f-0679-4ef6-9b22-503d6c4f0a67","Type":"ContainerDied","Data":"a5f379a608a7968b51fe7ff436cde791858e9b396784d3329176df76acb5aa05"} Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.441104 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5451328f-0679-4ef6-9b22-503d6c4f0a67","Type":"ContainerDied","Data":"68dfdc3da9a3171ad878db2cbf1c3eb5ca1d92c5c9ce80967a41f2b70c81b74d"} Sep 30 19:04:43 crc kubenswrapper[4747]: I0930 19:04:43.462852 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.462828894 podStartE2EDuration="8.462828894s" podCreationTimestamp="2025-09-30 19:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:43.454744122 +0000 UTC m=+1123.114224256" watchObservedRunningTime="2025-09-30 19:04:43.462828894 +0000 UTC m=+1123.122309018" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.457190 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h5f4l" event={"ID":"2fcd68f8-070b-4361-841f-acea0b80118f","Type":"ContainerStarted","Data":"62a1c03b04ebeca827ef9b623107b09329c40d008cc6c30a0f7baf1ecc74b7fc"} Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.462137 4747 generic.go:334] "Generic (PLEG): container finished" podID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerID="a2a75eb81fc3e3dea64915dd213d23c5432db56f4b345bfe6d22247f76b457cd" exitCode=0 Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 
19:04:44.462171 4747 generic.go:334] "Generic (PLEG): container finished" podID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerID="02819ee4daff16b63201d915d4a7edf2719ceebf62dfab8f9b9df3194d1367ab" exitCode=143 Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.462337 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bee42841-64d1-46a9-ab0d-bacbfb82adfe","Type":"ContainerDied","Data":"a2a75eb81fc3e3dea64915dd213d23c5432db56f4b345bfe6d22247f76b457cd"} Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.462364 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bee42841-64d1-46a9-ab0d-bacbfb82adfe","Type":"ContainerDied","Data":"02819ee4daff16b63201d915d4a7edf2719ceebf62dfab8f9b9df3194d1367ab"} Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.486057 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-h5f4l" podStartSLOduration=5.859363581 podStartE2EDuration="9.48603876s" podCreationTimestamp="2025-09-30 19:04:35 +0000 UTC" firstStartedPulling="2025-09-30 19:04:40.562375535 +0000 UTC m=+1120.221855649" lastFinishedPulling="2025-09-30 19:04:44.189050714 +0000 UTC m=+1123.848530828" observedRunningTime="2025-09-30 19:04:44.475126616 +0000 UTC m=+1124.134606750" watchObservedRunningTime="2025-09-30 19:04:44.48603876 +0000 UTC m=+1124.145518874" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.549277 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.609609 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.651759 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-combined-ca-bundle\") pod \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.651837 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-public-tls-certs\") pod \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.651955 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-httpd-run\") pod \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652028 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlvrn\" (UniqueName: \"kubernetes.io/projected/bee42841-64d1-46a9-ab0d-bacbfb82adfe-kube-api-access-jlvrn\") pod \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652359 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-scripts\") pod \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652398 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652432 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjhdt\" (UniqueName: \"kubernetes.io/projected/5451328f-0679-4ef6-9b22-503d6c4f0a67-kube-api-access-bjhdt\") pod \"5451328f-0679-4ef6-9b22-503d6c4f0a67\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652478 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-logs\") pod \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652511 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-logs\") pod \"5451328f-0679-4ef6-9b22-503d6c4f0a67\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652537 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-config-data\") pod \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\" (UID: \"bee42841-64d1-46a9-ab0d-bacbfb82adfe\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652583 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-combined-ca-bundle\") pod \"5451328f-0679-4ef6-9b22-503d6c4f0a67\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652640 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-scripts\") pod \"5451328f-0679-4ef6-9b22-503d6c4f0a67\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.652664 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"5451328f-0679-4ef6-9b22-503d6c4f0a67\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.653206 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-logs" (OuterVolumeSpecName: "logs") pod "5451328f-0679-4ef6-9b22-503d6c4f0a67" (UID: "5451328f-0679-4ef6-9b22-503d6c4f0a67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.653217 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-logs" (OuterVolumeSpecName: "logs") pod "bee42841-64d1-46a9-ab0d-bacbfb82adfe" (UID: "bee42841-64d1-46a9-ab0d-bacbfb82adfe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.653523 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bee42841-64d1-46a9-ab0d-bacbfb82adfe" (UID: "bee42841-64d1-46a9-ab0d-bacbfb82adfe"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.658086 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "5451328f-0679-4ef6-9b22-503d6c4f0a67" (UID: "5451328f-0679-4ef6-9b22-503d6c4f0a67"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.658345 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-scripts" (OuterVolumeSpecName: "scripts") pod "5451328f-0679-4ef6-9b22-503d6c4f0a67" (UID: "5451328f-0679-4ef6-9b22-503d6c4f0a67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.664101 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "bee42841-64d1-46a9-ab0d-bacbfb82adfe" (UID: "bee42841-64d1-46a9-ab0d-bacbfb82adfe"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.664142 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-scripts" (OuterVolumeSpecName: "scripts") pod "bee42841-64d1-46a9-ab0d-bacbfb82adfe" (UID: "bee42841-64d1-46a9-ab0d-bacbfb82adfe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.664225 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee42841-64d1-46a9-ab0d-bacbfb82adfe-kube-api-access-jlvrn" (OuterVolumeSpecName: "kube-api-access-jlvrn") pod "bee42841-64d1-46a9-ab0d-bacbfb82adfe" (UID: "bee42841-64d1-46a9-ab0d-bacbfb82adfe"). InnerVolumeSpecName "kube-api-access-jlvrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.664257 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5451328f-0679-4ef6-9b22-503d6c4f0a67-kube-api-access-bjhdt" (OuterVolumeSpecName: "kube-api-access-bjhdt") pod "5451328f-0679-4ef6-9b22-503d6c4f0a67" (UID: "5451328f-0679-4ef6-9b22-503d6c4f0a67"). InnerVolumeSpecName "kube-api-access-bjhdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.720483 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bee42841-64d1-46a9-ab0d-bacbfb82adfe" (UID: "bee42841-64d1-46a9-ab0d-bacbfb82adfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.722227 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bee42841-64d1-46a9-ab0d-bacbfb82adfe" (UID: "bee42841-64d1-46a9-ab0d-bacbfb82adfe"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.724078 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5451328f-0679-4ef6-9b22-503d6c4f0a67" (UID: "5451328f-0679-4ef6-9b22-503d6c4f0a67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.759542 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-config-data\") pod \"5451328f-0679-4ef6-9b22-503d6c4f0a67\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.759597 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-httpd-run\") pod \"5451328f-0679-4ef6-9b22-503d6c4f0a67\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.759630 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-internal-tls-certs\") pod \"5451328f-0679-4ef6-9b22-503d6c4f0a67\" (UID: \"5451328f-0679-4ef6-9b22-503d6c4f0a67\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760069 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760085 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760098 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760111 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlvrn\" (UniqueName: \"kubernetes.io/projected/bee42841-64d1-46a9-ab0d-bacbfb82adfe-kube-api-access-jlvrn\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760127 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760153 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760168 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjhdt\" (UniqueName: \"kubernetes.io/projected/5451328f-0679-4ef6-9b22-503d6c4f0a67-kube-api-access-bjhdt\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760180 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee42841-64d1-46a9-ab0d-bacbfb82adfe-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760191 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760204 
4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760216 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.760234 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.762143 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-config-data" (OuterVolumeSpecName: "config-data") pod "bee42841-64d1-46a9-ab0d-bacbfb82adfe" (UID: "bee42841-64d1-46a9-ab0d-bacbfb82adfe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.762579 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5451328f-0679-4ef6-9b22-503d6c4f0a67" (UID: "5451328f-0679-4ef6-9b22-503d6c4f0a67"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.786078 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d52e-account-create-mb42m"] Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.792555 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.805756 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-665b-account-create-rsr8w"] Sep 30 19:04:44 crc kubenswrapper[4747]: W0930 19:04:44.807041 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd91c92e_badb_4682_b203_7889aeeef868.slice/crio-e983d35eaabb23440a1a49ed3f8a657195cc4004b5a5d12366a44180319d6b64 WatchSource:0}: Error finding container e983d35eaabb23440a1a49ed3f8a657195cc4004b5a5d12366a44180319d6b64: Status 404 returned error can't find the container with id e983d35eaabb23440a1a49ed3f8a657195cc4004b5a5d12366a44180319d6b64 Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.813234 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.822316 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5451328f-0679-4ef6-9b22-503d6c4f0a67" (UID: "5451328f-0679-4ef6-9b22-503d6c4f0a67"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.824538 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-config-data" (OuterVolumeSpecName: "config-data") pod "5451328f-0679-4ef6-9b22-503d6c4f0a67" (UID: "5451328f-0679-4ef6-9b22-503d6c4f0a67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.855502 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.865890 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee42841-64d1-46a9-ab0d-bacbfb82adfe-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.866020 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.866046 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.866059 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5451328f-0679-4ef6-9b22-503d6c4f0a67-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.866101 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5451328f-0679-4ef6-9b22-503d6c4f0a67-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: 
I0930 19:04:44.866116 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.966970 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-combined-ca-bundle\") pod \"46fece27-faee-4f8e-a347-c15ae7c48426\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.967069 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-fernet-keys\") pod \"46fece27-faee-4f8e-a347-c15ae7c48426\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.967092 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-scripts\") pod \"46fece27-faee-4f8e-a347-c15ae7c48426\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.967120 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-credential-keys\") pod \"46fece27-faee-4f8e-a347-c15ae7c48426\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.967153 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbw24\" (UniqueName: \"kubernetes.io/projected/46fece27-faee-4f8e-a347-c15ae7c48426-kube-api-access-gbw24\") pod \"46fece27-faee-4f8e-a347-c15ae7c48426\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " Sep 30 19:04:44 crc 
kubenswrapper[4747]: I0930 19:04:44.967173 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-config-data\") pod \"46fece27-faee-4f8e-a347-c15ae7c48426\" (UID: \"46fece27-faee-4f8e-a347-c15ae7c48426\") " Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.972105 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-scripts" (OuterVolumeSpecName: "scripts") pod "46fece27-faee-4f8e-a347-c15ae7c48426" (UID: "46fece27-faee-4f8e-a347-c15ae7c48426"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.973025 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46fece27-faee-4f8e-a347-c15ae7c48426-kube-api-access-gbw24" (OuterVolumeSpecName: "kube-api-access-gbw24") pod "46fece27-faee-4f8e-a347-c15ae7c48426" (UID: "46fece27-faee-4f8e-a347-c15ae7c48426"). InnerVolumeSpecName "kube-api-access-gbw24". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.974222 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "46fece27-faee-4f8e-a347-c15ae7c48426" (UID: "46fece27-faee-4f8e-a347-c15ae7c48426"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:44 crc kubenswrapper[4747]: I0930 19:04:44.977216 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "46fece27-faee-4f8e-a347-c15ae7c48426" (UID: "46fece27-faee-4f8e-a347-c15ae7c48426"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.002591 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-config-data" (OuterVolumeSpecName: "config-data") pod "46fece27-faee-4f8e-a347-c15ae7c48426" (UID: "46fece27-faee-4f8e-a347-c15ae7c48426"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.006825 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46fece27-faee-4f8e-a347-c15ae7c48426" (UID: "46fece27-faee-4f8e-a347-c15ae7c48426"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.069511 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.069561 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.069580 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.069600 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbw24\" (UniqueName: \"kubernetes.io/projected/46fece27-faee-4f8e-a347-c15ae7c48426-kube-api-access-gbw24\") on 
node \"crc\" DevicePath \"\"" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.069617 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.069637 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fece27-faee-4f8e-a347-c15ae7c48426-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.427331 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-58x64" podUID="368862da-f0e8-461b-8fa3-201e3ff3d43f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.481088 4747 generic.go:334] "Generic (PLEG): container finished" podID="fd91c92e-badb-4682-b203-7889aeeef868" containerID="9e1be8fff257effce4ab7759cca64b5ade8b8616626e59d082b777ec812c7847" exitCode=0 Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.481195 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665b-account-create-rsr8w" event={"ID":"fd91c92e-badb-4682-b203-7889aeeef868","Type":"ContainerDied","Data":"9e1be8fff257effce4ab7759cca64b5ade8b8616626e59d082b777ec812c7847"} Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.481227 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665b-account-create-rsr8w" event={"ID":"fd91c92e-badb-4682-b203-7889aeeef868","Type":"ContainerStarted","Data":"e983d35eaabb23440a1a49ed3f8a657195cc4004b5a5d12366a44180319d6b64"} Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.488543 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5451328f-0679-4ef6-9b22-503d6c4f0a67","Type":"ContainerDied","Data":"52473ac44dcb0c6d3be95aac23a28f05cbe6f06be6392b272178811f13c473b5"} Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.488576 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.488622 4747 scope.go:117] "RemoveContainer" containerID="a5f379a608a7968b51fe7ff436cde791858e9b396784d3329176df76acb5aa05" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.491130 4747 generic.go:334] "Generic (PLEG): container finished" podID="2d86156e-2bee-4e6e-a152-5f2b92d0ed6e" containerID="58b13995015ed034b97597e203150eb92685a2311a843c8262ce841153e6ef65" exitCode=0 Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.491183 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d52e-account-create-mb42m" event={"ID":"2d86156e-2bee-4e6e-a152-5f2b92d0ed6e","Type":"ContainerDied","Data":"58b13995015ed034b97597e203150eb92685a2311a843c8262ce841153e6ef65"} Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.491208 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d52e-account-create-mb42m" event={"ID":"2d86156e-2bee-4e6e-a152-5f2b92d0ed6e","Type":"ContainerStarted","Data":"d45c5fa246d1cce027bd7609718834d90f1b3678c5494f0e88535ac334d3c5cc"} Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.495523 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bee42841-64d1-46a9-ab0d-bacbfb82adfe","Type":"ContainerDied","Data":"cc0df99ed285bbfaa09af8947b9f5aa3d08a2529d37a0e6a0f0c991b214d6d44"} Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.495554 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.512914 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5cdw6" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.513024 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5cdw6" event={"ID":"46fece27-faee-4f8e-a347-c15ae7c48426","Type":"ContainerDied","Data":"b3ee133dfee9244b6b2aa9fecc7a9124ec80f8c8790f5fdbad525e71208b7b4b"} Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.513054 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3ee133dfee9244b6b2aa9fecc7a9124ec80f8c8790f5fdbad525e71208b7b4b" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.528652 4747 scope.go:117] "RemoveContainer" containerID="68dfdc3da9a3171ad878db2cbf1c3eb5ca1d92c5c9ce80967a41f2b70c81b74d" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.547636 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.557887 4747 scope.go:117] "RemoveContainer" containerID="a2a75eb81fc3e3dea64915dd213d23c5432db56f4b345bfe6d22247f76b457cd" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.561669 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574130 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:45 crc kubenswrapper[4747]: E0930 19:04:45.574504 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerName="glance-httpd" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574526 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" 
containerName="glance-httpd" Sep 30 19:04:45 crc kubenswrapper[4747]: E0930 19:04:45.574556 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerName="glance-log" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574564 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerName="glance-log" Sep 30 19:04:45 crc kubenswrapper[4747]: E0930 19:04:45.574576 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerName="glance-httpd" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574585 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerName="glance-httpd" Sep 30 19:04:45 crc kubenswrapper[4747]: E0930 19:04:45.574595 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fece27-faee-4f8e-a347-c15ae7c48426" containerName="keystone-bootstrap" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574602 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fece27-faee-4f8e-a347-c15ae7c48426" containerName="keystone-bootstrap" Sep 30 19:04:45 crc kubenswrapper[4747]: E0930 19:04:45.574615 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerName="glance-log" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574623 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerName="glance-log" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574811 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerName="glance-httpd" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574855 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fece27-faee-4f8e-a347-c15ae7c48426" 
containerName="keystone-bootstrap" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574883 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerName="glance-httpd" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574912 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" containerName="glance-log" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.574961 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" containerName="glance-log" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.576009 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.580159 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.580393 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wm2bc" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.580532 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.580677 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.590729 4747 scope.go:117] "RemoveContainer" containerID="02819ee4daff16b63201d915d4a7edf2719ceebf62dfab8f9b9df3194d1367ab" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.629991 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.638548 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.661758 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.667862 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.669678 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.671669 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.671899 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.680210 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.687481 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.687554 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.687604 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd6fz\" (UniqueName: \"kubernetes.io/projected/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-kube-api-access-nd6fz\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.687634 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.687717 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.687854 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.687892 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.687914 4747 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5cdw6"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.688000 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.694332 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5cdw6"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.735031 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t55gv"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.740972 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.743947 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.743919 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.744090 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q4ln8" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.745184 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t55gv"] Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.745602 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789641 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd6fz\" 
(UniqueName: \"kubernetes.io/projected/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-kube-api-access-nd6fz\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789685 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789710 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789740 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-scripts\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789768 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-logs\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789790 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789811 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789832 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789877 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-combined-ca-bundle\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789910 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789949 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.789993 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790028 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790025 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790074 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4g98\" (UniqueName: \"kubernetes.io/projected/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-kube-api-access-d4g98\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790099 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-fernet-keys\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790184 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790206 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-config-data\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790223 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-scripts\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790250 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7857\" (UniqueName: \"kubernetes.io/projected/922b2387-c911-4696-aa99-9dd3e51640f8-kube-api-access-b7857\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790281 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-credential-keys\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790307 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790327 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790359 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.790627 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.796702 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.798699 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.798808 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.808098 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd6fz\" (UniqueName: \"kubernetes.io/projected/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-kube-api-access-nd6fz\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.814626 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.831590 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.892726 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-scripts\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.892825 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-logs\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.892876 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.892957 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-combined-ca-bundle\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.892991 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893031 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893101 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893165 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4g98\" (UniqueName: \"kubernetes.io/projected/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-kube-api-access-d4g98\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893213 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-fernet-keys\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893244 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893284 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-config-data\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893324 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7857\" (UniqueName: \"kubernetes.io/projected/922b2387-c911-4696-aa99-9dd3e51640f8-kube-api-access-b7857\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893353 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-scripts\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893359 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893394 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-credential-keys\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893463 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-logs\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.893799 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.896947 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.898386 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.898893 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-scripts\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.899561 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-credential-keys\") pod \"keystone-bootstrap-t55gv\" (UID: 
\"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.899662 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-scripts\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.901383 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-config-data\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.903407 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-fernet-keys\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.906803 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.906888 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.907236 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-combined-ca-bundle\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.913570 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4g98\" (UniqueName: \"kubernetes.io/projected/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-kube-api-access-d4g98\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.921637 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7857\" (UniqueName: \"kubernetes.io/projected/922b2387-c911-4696-aa99-9dd3e51640f8-kube-api-access-b7857\") pod \"keystone-bootstrap-t55gv\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") " pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:45 crc kubenswrapper[4747]: I0930 19:04:45.930955 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " pod="openstack/glance-default-external-api-0" Sep 30 19:04:45 crc 
kubenswrapper[4747]: I0930 19:04:45.998160 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:04:46 crc kubenswrapper[4747]: I0930 19:04:46.060107 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t55gv" Sep 30 19:04:46 crc kubenswrapper[4747]: I0930 19:04:46.362104 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:04:46 crc kubenswrapper[4747]: I0930 19:04:46.422187 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xx25s"] Sep 30 19:04:46 crc kubenswrapper[4747]: I0930 19:04:46.422448 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-xx25s" podUID="826e56d5-0794-46e2-b143-e331bc22358e" containerName="dnsmasq-dns" containerID="cri-o://a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235" gracePeriod=10 Sep 30 19:04:46 crc kubenswrapper[4747]: W0930 19:04:46.537780 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f82abbf_1e09_4f46_97c6_6076a5c3a7e0.slice/crio-438f36f95159ee84847dafd88eff358994ae9c882dbb53014522558179907668 WatchSource:0}: Error finding container 438f36f95159ee84847dafd88eff358994ae9c882dbb53014522558179907668: Status 404 returned error can't find the container with id 438f36f95159ee84847dafd88eff358994ae9c882dbb53014522558179907668 Sep 30 19:04:46 crc kubenswrapper[4747]: I0930 19:04:46.553160 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:04:46 crc kubenswrapper[4747]: I0930 19:04:46.592832 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t55gv"] Sep 30 19:04:46 crc kubenswrapper[4747]: I0930 19:04:46.637908 4747 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:04:46 crc kubenswrapper[4747]: W0930 19:04:46.725291 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod688a1ce6_0861_4c45_9e0c_a94c65a5c5d4.slice/crio-54fd27b3c778951e3f2357c2908e6905f603f7ebb82b83b9147133fe14b98388 WatchSource:0}: Error finding container 54fd27b3c778951e3f2357c2908e6905f603f7ebb82b83b9147133fe14b98388: Status 404 returned error can't find the container with id 54fd27b3c778951e3f2357c2908e6905f603f7ebb82b83b9147133fe14b98388 Sep 30 19:04:46 crc kubenswrapper[4747]: I0930 19:04:46.926575 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-665b-account-create-rsr8w" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.016443 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d52e-account-create-mb42m" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.027171 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf9zd\" (UniqueName: \"kubernetes.io/projected/fd91c92e-badb-4682-b203-7889aeeef868-kube-api-access-lf9zd\") pod \"fd91c92e-badb-4682-b203-7889aeeef868\" (UID: \"fd91c92e-badb-4682-b203-7889aeeef868\") " Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.032903 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd91c92e-badb-4682-b203-7889aeeef868-kube-api-access-lf9zd" (OuterVolumeSpecName: "kube-api-access-lf9zd") pod "fd91c92e-badb-4682-b203-7889aeeef868" (UID: "fd91c92e-badb-4682-b203-7889aeeef868"). InnerVolumeSpecName "kube-api-access-lf9zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.066472 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.115781 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46fece27-faee-4f8e-a347-c15ae7c48426" path="/var/lib/kubelet/pods/46fece27-faee-4f8e-a347-c15ae7c48426/volumes" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.116377 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5451328f-0679-4ef6-9b22-503d6c4f0a67" path="/var/lib/kubelet/pods/5451328f-0679-4ef6-9b22-503d6c4f0a67/volumes" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.119255 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee42841-64d1-46a9-ab0d-bacbfb82adfe" path="/var/lib/kubelet/pods/bee42841-64d1-46a9-ab0d-bacbfb82adfe/volumes" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.128857 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-config\") pod \"826e56d5-0794-46e2-b143-e331bc22358e\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.129033 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-sb\") pod \"826e56d5-0794-46e2-b143-e331bc22358e\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.129061 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl46r\" (UniqueName: \"kubernetes.io/projected/2d86156e-2bee-4e6e-a152-5f2b92d0ed6e-kube-api-access-rl46r\") pod \"2d86156e-2bee-4e6e-a152-5f2b92d0ed6e\" (UID: \"2d86156e-2bee-4e6e-a152-5f2b92d0ed6e\") " Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.129139 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-dns-svc\") pod \"826e56d5-0794-46e2-b143-e331bc22358e\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.129156 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-nb\") pod \"826e56d5-0794-46e2-b143-e331bc22358e\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.129203 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zkfh\" (UniqueName: \"kubernetes.io/projected/826e56d5-0794-46e2-b143-e331bc22358e-kube-api-access-8zkfh\") pod \"826e56d5-0794-46e2-b143-e331bc22358e\" (UID: \"826e56d5-0794-46e2-b143-e331bc22358e\") " Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.129544 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf9zd\" (UniqueName: \"kubernetes.io/projected/fd91c92e-badb-4682-b203-7889aeeef868-kube-api-access-lf9zd\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.133647 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d86156e-2bee-4e6e-a152-5f2b92d0ed6e-kube-api-access-rl46r" (OuterVolumeSpecName: "kube-api-access-rl46r") pod "2d86156e-2bee-4e6e-a152-5f2b92d0ed6e" (UID: "2d86156e-2bee-4e6e-a152-5f2b92d0ed6e"). InnerVolumeSpecName "kube-api-access-rl46r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.143524 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/826e56d5-0794-46e2-b143-e331bc22358e-kube-api-access-8zkfh" (OuterVolumeSpecName: "kube-api-access-8zkfh") pod "826e56d5-0794-46e2-b143-e331bc22358e" (UID: "826e56d5-0794-46e2-b143-e331bc22358e"). InnerVolumeSpecName "kube-api-access-8zkfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.169248 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "826e56d5-0794-46e2-b143-e331bc22358e" (UID: "826e56d5-0794-46e2-b143-e331bc22358e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.172092 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "826e56d5-0794-46e2-b143-e331bc22358e" (UID: "826e56d5-0794-46e2-b143-e331bc22358e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.176505 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-config" (OuterVolumeSpecName: "config") pod "826e56d5-0794-46e2-b143-e331bc22358e" (UID: "826e56d5-0794-46e2-b143-e331bc22358e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.192359 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "826e56d5-0794-46e2-b143-e331bc22358e" (UID: "826e56d5-0794-46e2-b143-e331bc22358e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.231241 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.231300 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.231316 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl46r\" (UniqueName: \"kubernetes.io/projected/2d86156e-2bee-4e6e-a152-5f2b92d0ed6e-kube-api-access-rl46r\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.231327 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.231338 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/826e56d5-0794-46e2-b143-e331bc22358e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.231349 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zkfh\" (UniqueName: 
\"kubernetes.io/projected/826e56d5-0794-46e2-b143-e331bc22358e-kube-api-access-8zkfh\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.533450 4747 generic.go:334] "Generic (PLEG): container finished" podID="2fcd68f8-070b-4361-841f-acea0b80118f" containerID="62a1c03b04ebeca827ef9b623107b09329c40d008cc6c30a0f7baf1ecc74b7fc" exitCode=0 Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.533540 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h5f4l" event={"ID":"2fcd68f8-070b-4361-841f-acea0b80118f","Type":"ContainerDied","Data":"62a1c03b04ebeca827ef9b623107b09329c40d008cc6c30a0f7baf1ecc74b7fc"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.536099 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d52e-account-create-mb42m" event={"ID":"2d86156e-2bee-4e6e-a152-5f2b92d0ed6e","Type":"ContainerDied","Data":"d45c5fa246d1cce027bd7609718834d90f1b3678c5494f0e88535ac334d3c5cc"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.536130 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45c5fa246d1cce027bd7609718834d90f1b3678c5494f0e88535ac334d3c5cc" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.536161 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d52e-account-create-mb42m" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.549954 4747 generic.go:334] "Generic (PLEG): container finished" podID="826e56d5-0794-46e2-b143-e331bc22358e" containerID="a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235" exitCode=0 Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.550004 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-xx25s" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.550046 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xx25s" event={"ID":"826e56d5-0794-46e2-b143-e331bc22358e","Type":"ContainerDied","Data":"a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.550111 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-xx25s" event={"ID":"826e56d5-0794-46e2-b143-e331bc22358e","Type":"ContainerDied","Data":"5a905a304b728fbe2d502121afe7d0106bf5103b9d67430ff14cf5446605c217"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.550133 4747 scope.go:117] "RemoveContainer" containerID="a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.552377 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0","Type":"ContainerStarted","Data":"2ae2c51de35c0de7257e4e9a5f833a1878d488774559ad44b39c31c302e2937a"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.552411 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0","Type":"ContainerStarted","Data":"438f36f95159ee84847dafd88eff358994ae9c882dbb53014522558179907668"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.554592 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t55gv" event={"ID":"922b2387-c911-4696-aa99-9dd3e51640f8","Type":"ContainerStarted","Data":"9b5db20853331d96d46333e42566488a9f8d438ab6332f0b4743e7ea153f7c03"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.554649 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t55gv" 
event={"ID":"922b2387-c911-4696-aa99-9dd3e51640f8","Type":"ContainerStarted","Data":"39d14f9091785980f9b7a1c54b71baa906475b124a4f4105d6191d20d2c52d1f"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.559074 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4","Type":"ContainerStarted","Data":"919d6e650959b54fe7556e9a78b03101837d97ef2d2ca35ac0183d02f039361b"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.559106 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4","Type":"ContainerStarted","Data":"54fd27b3c778951e3f2357c2908e6905f603f7ebb82b83b9147133fe14b98388"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.561086 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665b-account-create-rsr8w" event={"ID":"fd91c92e-badb-4682-b203-7889aeeef868","Type":"ContainerDied","Data":"e983d35eaabb23440a1a49ed3f8a657195cc4004b5a5d12366a44180319d6b64"} Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.561125 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e983d35eaabb23440a1a49ed3f8a657195cc4004b5a5d12366a44180319d6b64" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.561132 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-665b-account-create-rsr8w" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.576288 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-t55gv" podStartSLOduration=2.576267717 podStartE2EDuration="2.576267717s" podCreationTimestamp="2025-09-30 19:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:47.572076177 +0000 UTC m=+1127.231556311" watchObservedRunningTime="2025-09-30 19:04:47.576267717 +0000 UTC m=+1127.235747831" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.579572 4747 scope.go:117] "RemoveContainer" containerID="10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.596024 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xx25s"] Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.601288 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-xx25s"] Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.605816 4747 scope.go:117] "RemoveContainer" containerID="a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235" Sep 30 19:04:47 crc kubenswrapper[4747]: E0930 19:04:47.606155 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235\": container with ID starting with a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235 not found: ID does not exist" containerID="a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.606194 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235"} err="failed to get container status \"a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235\": rpc error: code = NotFound desc = could not find container \"a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235\": container with ID starting with a4a5c6780257025bc5bd09344e4a3f8997395b31d878d26b944ec669043d3235 not found: ID does not exist" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.606221 4747 scope.go:117] "RemoveContainer" containerID="10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f" Sep 30 19:04:47 crc kubenswrapper[4747]: E0930 19:04:47.606657 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f\": container with ID starting with 10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f not found: ID does not exist" containerID="10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f" Sep 30 19:04:47 crc kubenswrapper[4747]: I0930 19:04:47.606693 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f"} err="failed to get container status \"10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f\": rpc error: code = NotFound desc = could not find container \"10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f\": container with ID starting with 10b47fd4dff45a946812c3ab68180bb4be3b237899197b90f6ec2d64b920c43f not found: ID does not exist" Sep 30 19:04:48 crc kubenswrapper[4747]: I0930 19:04:48.576766 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0","Type":"ContainerStarted","Data":"d596236814f9eaa207e959f914461e79e4719a27bdd84f9494f3fa02a3c9e822"} Sep 30 19:04:48 crc kubenswrapper[4747]: I0930 19:04:48.579360 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4","Type":"ContainerStarted","Data":"c990a0cb5652bc98b3d7e30e8f51aa2beb1461811b7233bb7f25a8635cfb91fd"} Sep 30 19:04:48 crc kubenswrapper[4747]: I0930 19:04:48.600451 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.60043293 podStartE2EDuration="3.60043293s" podCreationTimestamp="2025-09-30 19:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:48.596204419 +0000 UTC m=+1128.255684533" watchObservedRunningTime="2025-09-30 19:04:48.60043293 +0000 UTC m=+1128.259913044" Sep 30 19:04:48 crc kubenswrapper[4747]: I0930 19:04:48.635497 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.635474626 podStartE2EDuration="3.635474626s" podCreationTimestamp="2025-09-30 19:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:48.629142184 +0000 UTC m=+1128.288622298" watchObservedRunningTime="2025-09-30 19:04:48.635474626 +0000 UTC m=+1128.294954740" Sep 30 19:04:48 crc kubenswrapper[4747]: I0930 19:04:48.953428 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.061791 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fcd68f8-070b-4361-841f-acea0b80118f-logs\") pod \"2fcd68f8-070b-4361-841f-acea0b80118f\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.062281 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddnrg\" (UniqueName: \"kubernetes.io/projected/2fcd68f8-070b-4361-841f-acea0b80118f-kube-api-access-ddnrg\") pod \"2fcd68f8-070b-4361-841f-acea0b80118f\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.062502 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-scripts\") pod \"2fcd68f8-070b-4361-841f-acea0b80118f\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.062599 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fcd68f8-070b-4361-841f-acea0b80118f-logs" (OuterVolumeSpecName: "logs") pod "2fcd68f8-070b-4361-841f-acea0b80118f" (UID: "2fcd68f8-070b-4361-841f-acea0b80118f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.062749 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-combined-ca-bundle\") pod \"2fcd68f8-070b-4361-841f-acea0b80118f\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.062907 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-config-data\") pod \"2fcd68f8-070b-4361-841f-acea0b80118f\" (UID: \"2fcd68f8-070b-4361-841f-acea0b80118f\") " Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.063555 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fcd68f8-070b-4361-841f-acea0b80118f-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.069264 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fcd68f8-070b-4361-841f-acea0b80118f-kube-api-access-ddnrg" (OuterVolumeSpecName: "kube-api-access-ddnrg") pod "2fcd68f8-070b-4361-841f-acea0b80118f" (UID: "2fcd68f8-070b-4361-841f-acea0b80118f"). InnerVolumeSpecName "kube-api-access-ddnrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.070750 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-scripts" (OuterVolumeSpecName: "scripts") pod "2fcd68f8-070b-4361-841f-acea0b80118f" (UID: "2fcd68f8-070b-4361-841f-acea0b80118f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.092269 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-config-data" (OuterVolumeSpecName: "config-data") pod "2fcd68f8-070b-4361-841f-acea0b80118f" (UID: "2fcd68f8-070b-4361-841f-acea0b80118f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.105903 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="826e56d5-0794-46e2-b143-e331bc22358e" path="/var/lib/kubelet/pods/826e56d5-0794-46e2-b143-e331bc22358e/volumes" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.106500 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fcd68f8-070b-4361-841f-acea0b80118f" (UID: "2fcd68f8-070b-4361-841f-acea0b80118f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.165053 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddnrg\" (UniqueName: \"kubernetes.io/projected/2fcd68f8-070b-4361-841f-acea0b80118f-kube-api-access-ddnrg\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.165098 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.165112 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.165123 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fcd68f8-070b-4361-841f-acea0b80118f-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.607500 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h5f4l" event={"ID":"2fcd68f8-070b-4361-841f-acea0b80118f","Type":"ContainerDied","Data":"3300aedc0dd6f01bbf3d5f1f6a2d7e40254a05388c7543a7bbf85a359be82c2b"} Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.607599 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3300aedc0dd6f01bbf3d5f1f6a2d7e40254a05388c7543a7bbf85a359be82c2b" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.607542 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h5f4l" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.611331 4747 generic.go:334] "Generic (PLEG): container finished" podID="922b2387-c911-4696-aa99-9dd3e51640f8" containerID="9b5db20853331d96d46333e42566488a9f8d438ab6332f0b4743e7ea153f7c03" exitCode=0 Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.611417 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t55gv" event={"ID":"922b2387-c911-4696-aa99-9dd3e51640f8","Type":"ContainerDied","Data":"9b5db20853331d96d46333e42566488a9f8d438ab6332f0b4743e7ea153f7c03"} Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.850164 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-66b578897b-btxxw"] Sep 30 19:04:49 crc kubenswrapper[4747]: E0930 19:04:49.850566 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826e56d5-0794-46e2-b143-e331bc22358e" containerName="dnsmasq-dns" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.850581 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="826e56d5-0794-46e2-b143-e331bc22358e" containerName="dnsmasq-dns" Sep 30 19:04:49 crc kubenswrapper[4747]: E0930 19:04:49.850600 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826e56d5-0794-46e2-b143-e331bc22358e" containerName="init" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.850608 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="826e56d5-0794-46e2-b143-e331bc22358e" containerName="init" Sep 30 19:04:49 crc kubenswrapper[4747]: E0930 19:04:49.850625 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd91c92e-badb-4682-b203-7889aeeef868" containerName="mariadb-account-create" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.850633 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd91c92e-badb-4682-b203-7889aeeef868" containerName="mariadb-account-create" Sep 30 19:04:49 crc 
kubenswrapper[4747]: E0930 19:04:49.850656 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d86156e-2bee-4e6e-a152-5f2b92d0ed6e" containerName="mariadb-account-create" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.850679 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d86156e-2bee-4e6e-a152-5f2b92d0ed6e" containerName="mariadb-account-create" Sep 30 19:04:49 crc kubenswrapper[4747]: E0930 19:04:49.850695 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fcd68f8-070b-4361-841f-acea0b80118f" containerName="placement-db-sync" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.850702 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcd68f8-070b-4361-841f-acea0b80118f" containerName="placement-db-sync" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.850885 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fcd68f8-070b-4361-841f-acea0b80118f" containerName="placement-db-sync" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.850913 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd91c92e-badb-4682-b203-7889aeeef868" containerName="mariadb-account-create" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.851331 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d86156e-2bee-4e6e-a152-5f2b92d0ed6e" containerName="mariadb-account-create" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.851352 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="826e56d5-0794-46e2-b143-e331bc22358e" containerName="dnsmasq-dns" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.852364 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.854412 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.854741 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.855198 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.856168 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2jdzz" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.857271 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.883370 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66b578897b-btxxw"] Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.982465 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-config-data\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.982508 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-combined-ca-bundle\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.982540 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-scripts\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.982564 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-logs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.982595 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhjxz\" (UniqueName: \"kubernetes.io/projected/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-kube-api-access-rhjxz\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.982624 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-internal-tls-certs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:49 crc kubenswrapper[4747]: I0930 19:04:49.983654 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-public-tls-certs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.085667 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhjxz\" (UniqueName: \"kubernetes.io/projected/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-kube-api-access-rhjxz\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.085738 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-internal-tls-certs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.085815 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-public-tls-certs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.085870 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-config-data\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.085894 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-combined-ca-bundle\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.085950 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-scripts\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.085985 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-logs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.086604 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-logs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.090440 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-internal-tls-certs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.090658 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-scripts\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.091544 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-config-data\") pod \"placement-66b578897b-btxxw\" (UID: 
\"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.094587 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-public-tls-certs\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.094849 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-combined-ca-bundle\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.111146 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhjxz\" (UniqueName: \"kubernetes.io/projected/ea495f4c-cb12-4b75-850e-ef1d64d1f9af-kube-api-access-rhjxz\") pod \"placement-66b578897b-btxxw\" (UID: \"ea495f4c-cb12-4b75-850e-ef1d64d1f9af\") " pod="openstack/placement-66b578897b-btxxw" Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.167479 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-66b578897b-btxxw"
Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.744282 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66b578897b-btxxw"]
Sep 30 19:04:50 crc kubenswrapper[4747]: W0930 19:04:50.751582 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea495f4c_cb12_4b75_850e_ef1d64d1f9af.slice/crio-e1628535468b6e64291b0dcd661ddf01c53ac82b3a3ca9b635308df8a18ca265 WatchSource:0}: Error finding container e1628535468b6e64291b0dcd661ddf01c53ac82b3a3ca9b635308df8a18ca265: Status 404 returned error can't find the container with id e1628535468b6e64291b0dcd661ddf01c53ac82b3a3ca9b635308df8a18ca265
Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.902035 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t55gv"
Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.999186 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-fernet-keys\") pod \"922b2387-c911-4696-aa99-9dd3e51640f8\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") "
Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.999249 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-scripts\") pod \"922b2387-c911-4696-aa99-9dd3e51640f8\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") "
Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.999308 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-credential-keys\") pod \"922b2387-c911-4696-aa99-9dd3e51640f8\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") "
Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.999339 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7857\" (UniqueName: \"kubernetes.io/projected/922b2387-c911-4696-aa99-9dd3e51640f8-kube-api-access-b7857\") pod \"922b2387-c911-4696-aa99-9dd3e51640f8\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") "
Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.999375 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-combined-ca-bundle\") pod \"922b2387-c911-4696-aa99-9dd3e51640f8\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") "
Sep 30 19:04:50 crc kubenswrapper[4747]: I0930 19:04:50.999474 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-config-data\") pod \"922b2387-c911-4696-aa99-9dd3e51640f8\" (UID: \"922b2387-c911-4696-aa99-9dd3e51640f8\") "
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.002908 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "922b2387-c911-4696-aa99-9dd3e51640f8" (UID: "922b2387-c911-4696-aa99-9dd3e51640f8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.003270 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/922b2387-c911-4696-aa99-9dd3e51640f8-kube-api-access-b7857" (OuterVolumeSpecName: "kube-api-access-b7857") pod "922b2387-c911-4696-aa99-9dd3e51640f8" (UID: "922b2387-c911-4696-aa99-9dd3e51640f8"). InnerVolumeSpecName "kube-api-access-b7857". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.003395 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-scripts" (OuterVolumeSpecName: "scripts") pod "922b2387-c911-4696-aa99-9dd3e51640f8" (UID: "922b2387-c911-4696-aa99-9dd3e51640f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.005223 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "922b2387-c911-4696-aa99-9dd3e51640f8" (UID: "922b2387-c911-4696-aa99-9dd3e51640f8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.023753 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "922b2387-c911-4696-aa99-9dd3e51640f8" (UID: "922b2387-c911-4696-aa99-9dd3e51640f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.031992 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-config-data" (OuterVolumeSpecName: "config-data") pod "922b2387-c911-4696-aa99-9dd3e51640f8" (UID: "922b2387-c911-4696-aa99-9dd3e51640f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.116962 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.117222 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.117232 4747 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.117241 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.117249 4747 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/922b2387-c911-4696-aa99-9dd3e51640f8-credential-keys\") on node \"crc\" DevicePath \"\""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.117260 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7857\" (UniqueName: \"kubernetes.io/projected/922b2387-c911-4696-aa99-9dd3e51640f8-kube-api-access-b7857\") on node \"crc\" DevicePath \"\""
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.638776 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66b578897b-btxxw" event={"ID":"ea495f4c-cb12-4b75-850e-ef1d64d1f9af","Type":"ContainerStarted","Data":"6a1ca3b407b3f122a9795e1d3b429e92776cd7b2187834c08ccd2ae384f8f635"}
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.638843 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66b578897b-btxxw" event={"ID":"ea495f4c-cb12-4b75-850e-ef1d64d1f9af","Type":"ContainerStarted","Data":"de4a6f12e2cb4c0b440f9d6d069c1d6bf066cdbf485e57481970b5b7cd9e3afd"}
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.638861 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66b578897b-btxxw" event={"ID":"ea495f4c-cb12-4b75-850e-ef1d64d1f9af","Type":"ContainerStarted","Data":"e1628535468b6e64291b0dcd661ddf01c53ac82b3a3ca9b635308df8a18ca265"}
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.640253 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66b578897b-btxxw"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.640473 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66b578897b-btxxw"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.640678 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t55gv" event={"ID":"922b2387-c911-4696-aa99-9dd3e51640f8","Type":"ContainerDied","Data":"39d14f9091785980f9b7a1c54b71baa906475b124a4f4105d6191d20d2c52d1f"}
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.640738 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39d14f9091785980f9b7a1c54b71baa906475b124a4f4105d6191d20d2c52d1f"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.642725 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t55gv"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.678887 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-66b578897b-btxxw" podStartSLOduration=2.678862768 podStartE2EDuration="2.678862768s" podCreationTimestamp="2025-09-30 19:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:51.670664823 +0000 UTC m=+1131.330144977" watchObservedRunningTime="2025-09-30 19:04:51.678862768 +0000 UTC m=+1131.338342912"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.791823 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-648477b6b5-24gx7"]
Sep 30 19:04:51 crc kubenswrapper[4747]: E0930 19:04:51.792259 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922b2387-c911-4696-aa99-9dd3e51640f8" containerName="keystone-bootstrap"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.792276 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="922b2387-c911-4696-aa99-9dd3e51640f8" containerName="keystone-bootstrap"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.792537 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="922b2387-c911-4696-aa99-9dd3e51640f8" containerName="keystone-bootstrap"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.793280 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.797592 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.797844 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.798017 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q4ln8"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.798325 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.798433 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.798536 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.818696 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-648477b6b5-24gx7"]
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.827821 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-credential-keys\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.827856 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpqfs\" (UniqueName: \"kubernetes.io/projected/333eedad-3727-4187-b392-e4c4d71bc2d1-kube-api-access-qpqfs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.827901 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-internal-tls-certs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.827946 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-config-data\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.828038 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-combined-ca-bundle\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.828055 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-fernet-keys\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.828100 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-public-tls-certs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.828142 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-scripts\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.929636 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-combined-ca-bundle\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.929923 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-fernet-keys\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.930058 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-public-tls-certs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.930186 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-scripts\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.930288 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-credential-keys\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.930397 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpqfs\" (UniqueName: \"kubernetes.io/projected/333eedad-3727-4187-b392-e4c4d71bc2d1-kube-api-access-qpqfs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.930510 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-internal-tls-certs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.930621 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-config-data\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.934737 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-fernet-keys\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.934879 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-public-tls-certs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.935221 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-combined-ca-bundle\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.935466 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-internal-tls-certs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.936247 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-scripts\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.937378 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-credential-keys\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.937663 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333eedad-3727-4187-b392-e4c4d71bc2d1-config-data\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:51 crc kubenswrapper[4747]: I0930 19:04:51.953494 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpqfs\" (UniqueName: \"kubernetes.io/projected/333eedad-3727-4187-b392-e4c4d71bc2d1-kube-api-access-qpqfs\") pod \"keystone-648477b6b5-24gx7\" (UID: \"333eedad-3727-4187-b392-e4c4d71bc2d1\") " pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.117688 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.638986 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-648477b6b5-24gx7"]
Sep 30 19:04:52 crc kubenswrapper[4747]: W0930 19:04:52.645155 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod333eedad_3727_4187_b392_e4c4d71bc2d1.slice/crio-a7b42810138cee6b0382a2ef2d8f053c4529f902dfa9c6a9a77bac713b074f99 WatchSource:0}: Error finding container a7b42810138cee6b0382a2ef2d8f053c4529f902dfa9c6a9a77bac713b074f99: Status 404 returned error can't find the container with id a7b42810138cee6b0382a2ef2d8f053c4529f902dfa9c6a9a77bac713b074f99
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.911122 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-kk26j"]
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.912498 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.914334 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-49fwb"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.915133 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.915769 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.928579 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-kk26j"]
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.947772 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zl7\" (UniqueName: \"kubernetes.io/projected/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-kube-api-access-94zl7\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.947807 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-config-data\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.947839 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-scripts\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.947858 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-db-sync-config-data\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.947881 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-combined-ca-bundle\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:52 crc kubenswrapper[4747]: I0930 19:04:52.948027 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-etc-machine-id\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.049117 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-etc-machine-id\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.049209 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zl7\" (UniqueName: \"kubernetes.io/projected/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-kube-api-access-94zl7\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.049235 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-config-data\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.049238 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-etc-machine-id\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.049257 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-scripts\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.049300 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-db-sync-config-data\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.049342 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-combined-ca-bundle\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.055240 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-combined-ca-bundle\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.056772 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-scripts\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.057452 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-config-data\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.057873 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-db-sync-config-data\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.064577 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zl7\" (UniqueName: \"kubernetes.io/projected/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-kube-api-access-94zl7\") pod \"cinder-db-sync-kk26j\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.100246 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vpn6w"]
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.101320 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.103129 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-b7mzs"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.103376 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.103374 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.124079 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vpn6w"]
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.150754 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgjhd\" (UniqueName: \"kubernetes.io/projected/69e42307-0f32-4155-bfa3-57d4de734daa-kube-api-access-rgjhd\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.150848 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-combined-ca-bundle\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.150892 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-config\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.242910 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kk26j"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.252816 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgjhd\" (UniqueName: \"kubernetes.io/projected/69e42307-0f32-4155-bfa3-57d4de734daa-kube-api-access-rgjhd\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.252945 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-combined-ca-bundle\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.252989 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-config\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.268810 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-combined-ca-bundle\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.269997 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-config\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.272458 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgjhd\" (UniqueName: \"kubernetes.io/projected/69e42307-0f32-4155-bfa3-57d4de734daa-kube-api-access-rgjhd\") pod \"neutron-db-sync-vpn6w\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.418737 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vpn6w"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.663539 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vpn6w"]
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.664605 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-648477b6b5-24gx7" event={"ID":"333eedad-3727-4187-b392-e4c4d71bc2d1","Type":"ContainerStarted","Data":"dd84a2adcea999394196c64bbea0848358adc0bcd18f97b2bdd4a1d055125f91"}
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.664647 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-648477b6b5-24gx7" event={"ID":"333eedad-3727-4187-b392-e4c4d71bc2d1","Type":"ContainerStarted","Data":"a7b42810138cee6b0382a2ef2d8f053c4529f902dfa9c6a9a77bac713b074f99"}
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.664834 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-648477b6b5-24gx7"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.690854 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-648477b6b5-24gx7" podStartSLOduration=2.690827589 podStartE2EDuration="2.690827589s" podCreationTimestamp="2025-09-30 19:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:53.682940353 +0000 UTC m=+1133.342420467" watchObservedRunningTime="2025-09-30 19:04:53.690827589 +0000 UTC m=+1133.350307713"
Sep 30 19:04:53 crc kubenswrapper[4747]: I0930 19:04:53.758119 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-kk26j"]
Sep 30 19:04:53 crc kubenswrapper[4747]: W0930 19:04:53.760566 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fbc3a67_c4e2_4f47_9ed0_bcefdc423407.slice/crio-3084f6afbe8a1b38eda6204871138c9329edeb1540c9e2008f8a4ef5898ec3aa WatchSource:0}: Error finding container 3084f6afbe8a1b38eda6204871138c9329edeb1540c9e2008f8a4ef5898ec3aa: Status 404 returned error can't find the container with id 3084f6afbe8a1b38eda6204871138c9329edeb1540c9e2008f8a4ef5898ec3aa
Sep 30 19:04:54 crc kubenswrapper[4747]: I0930 19:04:54.675623 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vpn6w" event={"ID":"69e42307-0f32-4155-bfa3-57d4de734daa","Type":"ContainerStarted","Data":"7da630a819c95600ab6a65d26a5f43bb0e2c342cce2fe5d3d9907821645c2ccb"}
Sep 30 19:04:54 crc kubenswrapper[4747]: I0930 19:04:54.675970 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vpn6w" event={"ID":"69e42307-0f32-4155-bfa3-57d4de734daa","Type":"ContainerStarted","Data":"3e53a52f8a227ca8967a239bc50b9b5f4024cc2b0f0d2955204d4bb2152a2a6e"}
Sep 30 19:04:54 crc kubenswrapper[4747]: I0930 19:04:54.677853 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kk26j" event={"ID":"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407","Type":"ContainerStarted","Data":"3084f6afbe8a1b38eda6204871138c9329edeb1540c9e2008f8a4ef5898ec3aa"}
Sep 30 19:04:54 crc kubenswrapper[4747]: I0930 19:04:54.695982 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vpn6w" podStartSLOduration=1.695965006 podStartE2EDuration="1.695965006s" podCreationTimestamp="2025-09-30 19:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC"
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:04:54.694911196 +0000 UTC m=+1134.354391300" watchObservedRunningTime="2025-09-30 19:04:54.695965006 +0000 UTC m=+1134.355445120" Sep 30 19:04:55 crc kubenswrapper[4747]: I0930 19:04:55.907102 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:55 crc kubenswrapper[4747]: I0930 19:04:55.907467 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:55 crc kubenswrapper[4747]: I0930 19:04:55.967694 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:55 crc kubenswrapper[4747]: I0930 19:04:55.968764 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:56 crc kubenswrapper[4747]: I0930 19:04:56.000110 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 19:04:56 crc kubenswrapper[4747]: I0930 19:04:56.000206 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 19:04:56 crc kubenswrapper[4747]: I0930 19:04:56.041052 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 19:04:56 crc kubenswrapper[4747]: I0930 19:04:56.050609 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 19:04:56 crc kubenswrapper[4747]: I0930 19:04:56.697238 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 19:04:56 crc kubenswrapper[4747]: I0930 19:04:56.697282 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Sep 30 19:04:56 crc kubenswrapper[4747]: I0930 19:04:56.697295 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:56 crc kubenswrapper[4747]: I0930 19:04:56.697307 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:58 crc kubenswrapper[4747]: I0930 19:04:58.540633 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:58 crc kubenswrapper[4747]: I0930 19:04:58.563965 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 19:04:58 crc kubenswrapper[4747]: I0930 19:04:58.629627 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 19:04:58 crc kubenswrapper[4747]: I0930 19:04:58.682395 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 19:05:08 crc kubenswrapper[4747]: I0930 19:05:08.834776 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kk26j" event={"ID":"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407","Type":"ContainerStarted","Data":"ca8b6a89f13a5d2a6df2339f3a8a096caf7e891371d6692b121f0fd805176a77"} Sep 30 19:05:08 crc kubenswrapper[4747]: I0930 19:05:08.841575 4747 generic.go:334] "Generic (PLEG): container finished" podID="69e42307-0f32-4155-bfa3-57d4de734daa" containerID="7da630a819c95600ab6a65d26a5f43bb0e2c342cce2fe5d3d9907821645c2ccb" exitCode=0 Sep 30 19:05:08 crc kubenswrapper[4747]: I0930 19:05:08.841638 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vpn6w" 
event={"ID":"69e42307-0f32-4155-bfa3-57d4de734daa","Type":"ContainerDied","Data":"7da630a819c95600ab6a65d26a5f43bb0e2c342cce2fe5d3d9907821645c2ccb"} Sep 30 19:05:08 crc kubenswrapper[4747]: I0930 19:05:08.865401 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-kk26j" podStartSLOduration=2.77306548 podStartE2EDuration="16.865385554s" podCreationTimestamp="2025-09-30 19:04:52 +0000 UTC" firstStartedPulling="2025-09-30 19:04:53.762215309 +0000 UTC m=+1133.421695423" lastFinishedPulling="2025-09-30 19:05:07.854535383 +0000 UTC m=+1147.514015497" observedRunningTime="2025-09-30 19:05:08.862416889 +0000 UTC m=+1148.521897093" watchObservedRunningTime="2025-09-30 19:05:08.865385554 +0000 UTC m=+1148.524865668" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.290906 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vpn6w" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.375262 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-config\") pod \"69e42307-0f32-4155-bfa3-57d4de734daa\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.375351 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgjhd\" (UniqueName: \"kubernetes.io/projected/69e42307-0f32-4155-bfa3-57d4de734daa-kube-api-access-rgjhd\") pod \"69e42307-0f32-4155-bfa3-57d4de734daa\" (UID: \"69e42307-0f32-4155-bfa3-57d4de734daa\") " Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.375469 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-combined-ca-bundle\") pod \"69e42307-0f32-4155-bfa3-57d4de734daa\" (UID: 
\"69e42307-0f32-4155-bfa3-57d4de734daa\") " Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.382666 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e42307-0f32-4155-bfa3-57d4de734daa-kube-api-access-rgjhd" (OuterVolumeSpecName: "kube-api-access-rgjhd") pod "69e42307-0f32-4155-bfa3-57d4de734daa" (UID: "69e42307-0f32-4155-bfa3-57d4de734daa"). InnerVolumeSpecName "kube-api-access-rgjhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.402428 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69e42307-0f32-4155-bfa3-57d4de734daa" (UID: "69e42307-0f32-4155-bfa3-57d4de734daa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.405265 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-config" (OuterVolumeSpecName: "config") pod "69e42307-0f32-4155-bfa3-57d4de734daa" (UID: "69e42307-0f32-4155-bfa3-57d4de734daa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.477162 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.477359 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgjhd\" (UniqueName: \"kubernetes.io/projected/69e42307-0f32-4155-bfa3-57d4de734daa-kube-api-access-rgjhd\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.477421 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e42307-0f32-4155-bfa3-57d4de734daa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.868207 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vpn6w" event={"ID":"69e42307-0f32-4155-bfa3-57d4de734daa","Type":"ContainerDied","Data":"3e53a52f8a227ca8967a239bc50b9b5f4024cc2b0f0d2955204d4bb2152a2a6e"} Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.868559 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e53a52f8a227ca8967a239bc50b9b5f4024cc2b0f0d2955204d4bb2152a2a6e" Sep 30 19:05:10 crc kubenswrapper[4747]: I0930 19:05:10.868298 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vpn6w" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.067788 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-kqv9n"] Sep 30 19:05:11 crc kubenswrapper[4747]: E0930 19:05:11.068230 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e42307-0f32-4155-bfa3-57d4de734daa" containerName="neutron-db-sync" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.068250 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e42307-0f32-4155-bfa3-57d4de734daa" containerName="neutron-db-sync" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.068425 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e42307-0f32-4155-bfa3-57d4de734daa" containerName="neutron-db-sync" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.069291 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.083495 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-kqv9n"] Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.167453 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db8dcfd56-rp5sl"] Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.169130 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.171452 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.171501 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.171675 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-b7mzs" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.171872 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.185122 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db8dcfd56-rp5sl"] Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.190052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcgf5\" (UniqueName: \"kubernetes.io/projected/260df626-9266-4d41-ad27-630a5aa58cdf-kube-api-access-mcgf5\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.190152 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-config\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.190197 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.190231 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.190277 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292363 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-config\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292421 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292456 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: 
\"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292482 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-ovndb-tls-certs\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292521 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292557 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-combined-ca-bundle\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292581 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz6c9\" (UniqueName: \"kubernetes.io/projected/94976787-0fa3-4e10-9902-1918b9512f30-kube-api-access-kz6c9\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292597 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-httpd-config\") pod \"neutron-db8dcfd56-rp5sl\" (UID: 
\"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292616 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcgf5\" (UniqueName: \"kubernetes.io/projected/260df626-9266-4d41-ad27-630a5aa58cdf-kube-api-access-mcgf5\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.292635 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-config\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.293780 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.293814 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-config\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.293852 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 
19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.293961 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.312643 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcgf5\" (UniqueName: \"kubernetes.io/projected/260df626-9266-4d41-ad27-630a5aa58cdf-kube-api-access-mcgf5\") pod \"dnsmasq-dns-5f66db59b9-kqv9n\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.387850 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.394340 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-ovndb-tls-certs\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.394441 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-combined-ca-bundle\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.394479 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz6c9\" (UniqueName: \"kubernetes.io/projected/94976787-0fa3-4e10-9902-1918b9512f30-kube-api-access-kz6c9\") pod 
\"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.394500 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-httpd-config\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.394530 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-config\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.398250 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-config\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.398311 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-combined-ca-bundle\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.400646 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-httpd-config\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 
19:05:11.410594 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-ovndb-tls-certs\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.414118 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz6c9\" (UniqueName: \"kubernetes.io/projected/94976787-0fa3-4e10-9902-1918b9512f30-kube-api-access-kz6c9\") pod \"neutron-db8dcfd56-rp5sl\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.487062 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:11 crc kubenswrapper[4747]: I0930 19:05:11.902492 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-kqv9n"] Sep 30 19:05:11 crc kubenswrapper[4747]: W0930 19:05:11.904134 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod260df626_9266_4d41_ad27_630a5aa58cdf.slice/crio-a48abf400b6e2792e19406f10443381a6ff74bea256c92925d9aa8359313902b WatchSource:0}: Error finding container a48abf400b6e2792e19406f10443381a6ff74bea256c92925d9aa8359313902b: Status 404 returned error can't find the container with id a48abf400b6e2792e19406f10443381a6ff74bea256c92925d9aa8359313902b Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.060902 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db8dcfd56-rp5sl"] Sep 30 19:05:12 crc kubenswrapper[4747]: W0930 19:05:12.062284 4747 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94976787_0fa3_4e10_9902_1918b9512f30.slice/crio-5f27af16864ce7f68130408cfc1325e974b2fd43c4b0bf46724d4dc4f59be3e4 WatchSource:0}: Error finding container 5f27af16864ce7f68130408cfc1325e974b2fd43c4b0bf46724d4dc4f59be3e4: Status 404 returned error can't find the container with id 5f27af16864ce7f68130408cfc1325e974b2fd43c4b0bf46724d4dc4f59be3e4 Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.885503 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db8dcfd56-rp5sl" event={"ID":"94976787-0fa3-4e10-9902-1918b9512f30","Type":"ContainerStarted","Data":"ff516d0918a894dbef5bd7e6c1cc4a8bdcfe9e38ed9a3aa7c4c55355b6dcdb0b"} Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.885963 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.885988 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db8dcfd56-rp5sl" event={"ID":"94976787-0fa3-4e10-9902-1918b9512f30","Type":"ContainerStarted","Data":"5ae68ca2574ead6cd58662940d59695ac06f879053d563bebf9d30a87e675907"} Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.886007 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db8dcfd56-rp5sl" event={"ID":"94976787-0fa3-4e10-9902-1918b9512f30","Type":"ContainerStarted","Data":"5f27af16864ce7f68130408cfc1325e974b2fd43c4b0bf46724d4dc4f59be3e4"} Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.887093 4747 generic.go:334] "Generic (PLEG): container finished" podID="8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" containerID="ca8b6a89f13a5d2a6df2339f3a8a096caf7e891371d6692b121f0fd805176a77" exitCode=0 Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.887175 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kk26j" 
event={"ID":"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407","Type":"ContainerDied","Data":"ca8b6a89f13a5d2a6df2339f3a8a096caf7e891371d6692b121f0fd805176a77"} Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.888640 4747 generic.go:334] "Generic (PLEG): container finished" podID="260df626-9266-4d41-ad27-630a5aa58cdf" containerID="91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8" exitCode=0 Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.888671 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" event={"ID":"260df626-9266-4d41-ad27-630a5aa58cdf","Type":"ContainerDied","Data":"91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8"} Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.888708 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" event={"ID":"260df626-9266-4d41-ad27-630a5aa58cdf","Type":"ContainerStarted","Data":"a48abf400b6e2792e19406f10443381a6ff74bea256c92925d9aa8359313902b"} Sep 30 19:05:12 crc kubenswrapper[4747]: I0930 19:05:12.906180 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db8dcfd56-rp5sl" podStartSLOduration=1.90615607 podStartE2EDuration="1.90615607s" podCreationTimestamp="2025-09-30 19:05:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:12.901656181 +0000 UTC m=+1152.561136305" watchObservedRunningTime="2025-09-30 19:05:12.90615607 +0000 UTC m=+1152.565636204" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.132272 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5677cbcc7-67jtn"] Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.138895 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.142248 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.142409 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.151115 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5677cbcc7-67jtn"] Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.223585 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqxlf\" (UniqueName: \"kubernetes.io/projected/89f566eb-95f6-490d-a9be-7995bc1dbd4f-kube-api-access-qqxlf\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.223667 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-httpd-config\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.223727 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-internal-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.223749 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-combined-ca-bundle\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.223767 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-config\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.223812 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-public-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.224204 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-ovndb-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.325425 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-public-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.326054 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-ovndb-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.326093 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqxlf\" (UniqueName: \"kubernetes.io/projected/89f566eb-95f6-490d-a9be-7995bc1dbd4f-kube-api-access-qqxlf\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.326134 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-httpd-config\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.326173 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-internal-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.326195 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-combined-ca-bundle\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.326213 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-config\") pod 
\"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.331023 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-ovndb-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.331204 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-httpd-config\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.333275 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-config\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.334660 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-internal-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.347769 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-combined-ca-bundle\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: 
I0930 19:05:13.348500 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89f566eb-95f6-490d-a9be-7995bc1dbd4f-public-tls-certs\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.357465 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqxlf\" (UniqueName: \"kubernetes.io/projected/89f566eb-95f6-490d-a9be-7995bc1dbd4f-kube-api-access-qqxlf\") pod \"neutron-5677cbcc7-67jtn\" (UID: \"89f566eb-95f6-490d-a9be-7995bc1dbd4f\") " pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.464336 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.815492 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5677cbcc7-67jtn"] Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.904736 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" event={"ID":"260df626-9266-4d41-ad27-630a5aa58cdf","Type":"ContainerStarted","Data":"2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97"} Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.904796 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.912996 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5677cbcc7-67jtn" event={"ID":"89f566eb-95f6-490d-a9be-7995bc1dbd4f","Type":"ContainerStarted","Data":"062ab7551bfb45428f18d77cd063c2fff854bee29d29a84eb97bf6e4b21eb813"} Sep 30 19:05:13 crc kubenswrapper[4747]: I0930 19:05:13.946673 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" podStartSLOduration=2.946647672 podStartE2EDuration="2.946647672s" podCreationTimestamp="2025-09-30 19:05:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:13.933555826 +0000 UTC m=+1153.593035950" watchObservedRunningTime="2025-09-30 19:05:13.946647672 +0000 UTC m=+1153.606127796" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.173770 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kk26j" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.252539 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94zl7\" (UniqueName: \"kubernetes.io/projected/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-kube-api-access-94zl7\") pod \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.252607 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-scripts\") pod \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.252659 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-db-sync-config-data\") pod \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.252707 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-config-data\") pod \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\" (UID: 
\"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.252772 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-etc-machine-id\") pod \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.252843 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-combined-ca-bundle\") pod \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\" (UID: \"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407\") " Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.253060 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" (UID: "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.256709 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" (UID: "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.257055 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-kube-api-access-94zl7" (OuterVolumeSpecName: "kube-api-access-94zl7") pod "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" (UID: "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407"). InnerVolumeSpecName "kube-api-access-94zl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.257407 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-scripts" (OuterVolumeSpecName: "scripts") pod "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" (UID: "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.277357 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" (UID: "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.319543 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-config-data" (OuterVolumeSpecName: "config-data") pod "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" (UID: "8fbc3a67-c4e2-4f47-9ed0-bcefdc423407"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.355775 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.355959 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94zl7\" (UniqueName: \"kubernetes.io/projected/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-kube-api-access-94zl7\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.356063 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.356086 4747 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.356104 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.356122 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.928348 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kk26j" event={"ID":"8fbc3a67-c4e2-4f47-9ed0-bcefdc423407","Type":"ContainerDied","Data":"3084f6afbe8a1b38eda6204871138c9329edeb1540c9e2008f8a4ef5898ec3aa"} Sep 30 19:05:14 crc 
kubenswrapper[4747]: I0930 19:05:14.928708 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3084f6afbe8a1b38eda6204871138c9329edeb1540c9e2008f8a4ef5898ec3aa" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.928797 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kk26j" Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.939857 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5677cbcc7-67jtn" event={"ID":"89f566eb-95f6-490d-a9be-7995bc1dbd4f","Type":"ContainerStarted","Data":"1d613ca86735eca588c18f4067bdf714f51e6ed57bd487bf66b3b12c5b261e10"} Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.939983 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5677cbcc7-67jtn" event={"ID":"89f566eb-95f6-490d-a9be-7995bc1dbd4f","Type":"ContainerStarted","Data":"20b31cb7e17d728a57a29f4d7d5be32132a0369501aa494ba77301e8b7f60a60"} Sep 30 19:05:14 crc kubenswrapper[4747]: I0930 19:05:14.990507 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5677cbcc7-67jtn" podStartSLOduration=1.990480089 podStartE2EDuration="1.990480089s" podCreationTimestamp="2025-09-30 19:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:14.967467058 +0000 UTC m=+1154.626947202" watchObservedRunningTime="2025-09-30 19:05:14.990480089 +0000 UTC m=+1154.649960243" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.204090 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:15 crc kubenswrapper[4747]: E0930 19:05:15.204681 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" containerName="cinder-db-sync" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.204726 4747 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" containerName="cinder-db-sync" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.205051 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" containerName="cinder-db-sync" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.206717 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.210327 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-49fwb" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.210670 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.210807 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.210916 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.237196 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.258677 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-kqv9n"] Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.284552 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85fcfdb47c-pn5mf"] Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.293504 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.294691 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.294742 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.294765 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.294784 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.294807 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 
30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.294840 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vht2m\" (UniqueName: \"kubernetes.io/projected/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-kube-api-access-vht2m\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.310482 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85fcfdb47c-pn5mf"] Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.403889 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx95d\" (UniqueName: \"kubernetes.io/projected/095a8eef-72de-4bd9-a435-c2b7ef2e832e-kube-api-access-nx95d\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.403942 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.403974 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-config\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.404006 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-dns-svc\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.404024 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.404045 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.404065 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.404081 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.404103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vht2m\" (UniqueName: \"kubernetes.io/projected/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-kube-api-access-vht2m\") pod \"cinder-scheduler-0\" (UID: 
\"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.404133 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-sb\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.404162 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-nb\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.408547 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.410825 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.413439 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 
19:05:15.414643 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.415941 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.418400 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.419776 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.427351 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.428156 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.434572 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vht2m\" (UniqueName: \"kubernetes.io/projected/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-kube-api-access-vht2m\") pod \"cinder-scheduler-0\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.505915 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-config\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: 
\"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.505984 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506013 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-dns-svc\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506045 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vkkv\" (UniqueName: \"kubernetes.io/projected/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-kube-api-access-5vkkv\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506088 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-sb\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506105 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-scripts\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 
crc kubenswrapper[4747]: I0930 19:05:15.506126 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506147 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-nb\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506350 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506384 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506431 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-logs\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506531 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nx95d\" (UniqueName: \"kubernetes.io/projected/095a8eef-72de-4bd9-a435-c2b7ef2e832e-kube-api-access-nx95d\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506675 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-config\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506697 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-dns-svc\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.506907 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-nb\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.507206 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-sb\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.525521 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx95d\" (UniqueName: 
\"kubernetes.io/projected/095a8eef-72de-4bd9-a435-c2b7ef2e832e-kube-api-access-nx95d\") pod \"dnsmasq-dns-85fcfdb47c-pn5mf\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.564267 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.608072 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-scripts\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.608115 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.608146 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.608161 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.608181 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-logs\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.608241 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.608273 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vkkv\" (UniqueName: \"kubernetes.io/projected/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-kube-api-access-5vkkv\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.608821 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-logs\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.609671 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.614334 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-scripts\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.614677 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.621148 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.628483 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vkkv\" (UniqueName: \"kubernetes.io/projected/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-kube-api-access-5vkkv\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.628745 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.630653 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.810528 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.950010 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" podUID="260df626-9266-4d41-ad27-630a5aa58cdf" containerName="dnsmasq-dns" containerID="cri-o://2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97" gracePeriod=10 Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.950127 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:15 crc kubenswrapper[4747]: I0930 19:05:15.999286 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85fcfdb47c-pn5mf"] Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.064204 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:16 crc kubenswrapper[4747]: W0930 19:05:16.065198 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod095a8eef_72de_4bd9_a435_c2b7ef2e832e.slice/crio-888a137d279163f15def566c4ebb9ed2992074b4653480d6f44e4ed2e87e70bf WatchSource:0}: Error finding container 888a137d279163f15def566c4ebb9ed2992074b4653480d6f44e4ed2e87e70bf: Status 404 returned error can't find the container with id 888a137d279163f15def566c4ebb9ed2992074b4653480d6f44e4ed2e87e70bf Sep 30 19:05:16 crc kubenswrapper[4747]: W0930 19:05:16.068014 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d6c0e23_33ea_4e36_bba3_63cdbc10089b.slice/crio-7e504fbb5bed641a5fc6bda666aa3d5b53ac23a26b9b98a0813adb4d21aa8e26 WatchSource:0}: Error finding container 7e504fbb5bed641a5fc6bda666aa3d5b53ac23a26b9b98a0813adb4d21aa8e26: Status 404 returned error can't find the container with id 7e504fbb5bed641a5fc6bda666aa3d5b53ac23a26b9b98a0813adb4d21aa8e26 Sep 30 
19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.151089 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:16 crc kubenswrapper[4747]: W0930 19:05:16.159517 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fee1df4_d3f4_45a8_82a5_dfb6496bf3f3.slice/crio-f1c8ddedb20344066ca9cf9672132b34d67812f3c076f166961d90b64692434e WatchSource:0}: Error finding container f1c8ddedb20344066ca9cf9672132b34d67812f3c076f166961d90b64692434e: Status 404 returned error can't find the container with id f1c8ddedb20344066ca9cf9672132b34d67812f3c076f166961d90b64692434e Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.393916 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.530908 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-nb\") pod \"260df626-9266-4d41-ad27-630a5aa58cdf\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.531043 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-sb\") pod \"260df626-9266-4d41-ad27-630a5aa58cdf\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.531348 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcgf5\" (UniqueName: \"kubernetes.io/projected/260df626-9266-4d41-ad27-630a5aa58cdf-kube-api-access-mcgf5\") pod \"260df626-9266-4d41-ad27-630a5aa58cdf\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 
19:05:16.531489 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-config\") pod \"260df626-9266-4d41-ad27-630a5aa58cdf\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.531728 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-dns-svc\") pod \"260df626-9266-4d41-ad27-630a5aa58cdf\" (UID: \"260df626-9266-4d41-ad27-630a5aa58cdf\") " Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.536205 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/260df626-9266-4d41-ad27-630a5aa58cdf-kube-api-access-mcgf5" (OuterVolumeSpecName: "kube-api-access-mcgf5") pod "260df626-9266-4d41-ad27-630a5aa58cdf" (UID: "260df626-9266-4d41-ad27-630a5aa58cdf"). InnerVolumeSpecName "kube-api-access-mcgf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.588677 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-config" (OuterVolumeSpecName: "config") pod "260df626-9266-4d41-ad27-630a5aa58cdf" (UID: "260df626-9266-4d41-ad27-630a5aa58cdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.594978 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "260df626-9266-4d41-ad27-630a5aa58cdf" (UID: "260df626-9266-4d41-ad27-630a5aa58cdf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.596647 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "260df626-9266-4d41-ad27-630a5aa58cdf" (UID: "260df626-9266-4d41-ad27-630a5aa58cdf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.611822 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "260df626-9266-4d41-ad27-630a5aa58cdf" (UID: "260df626-9266-4d41-ad27-630a5aa58cdf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.644992 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.645372 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.645386 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.645401 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/260df626-9266-4d41-ad27-630a5aa58cdf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:16 crc 
kubenswrapper[4747]: I0930 19:05:16.645414 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcgf5\" (UniqueName: \"kubernetes.io/projected/260df626-9266-4d41-ad27-630a5aa58cdf-kube-api-access-mcgf5\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.961604 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3","Type":"ContainerStarted","Data":"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d"} Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.961644 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3","Type":"ContainerStarted","Data":"f1c8ddedb20344066ca9cf9672132b34d67812f3c076f166961d90b64692434e"} Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.963825 4747 generic.go:334] "Generic (PLEG): container finished" podID="260df626-9266-4d41-ad27-630a5aa58cdf" containerID="2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97" exitCode=0 Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.963891 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.963922 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" event={"ID":"260df626-9266-4d41-ad27-630a5aa58cdf","Type":"ContainerDied","Data":"2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97"} Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.964303 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-kqv9n" event={"ID":"260df626-9266-4d41-ad27-630a5aa58cdf","Type":"ContainerDied","Data":"a48abf400b6e2792e19406f10443381a6ff74bea256c92925d9aa8359313902b"} Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.964331 4747 scope.go:117] "RemoveContainer" containerID="2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97" Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.965899 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d6c0e23-33ea-4e36-bba3-63cdbc10089b","Type":"ContainerStarted","Data":"7e504fbb5bed641a5fc6bda666aa3d5b53ac23a26b9b98a0813adb4d21aa8e26"} Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.969840 4747 generic.go:334] "Generic (PLEG): container finished" podID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" containerID="c578be2151e70e7ae5c5dbbc2e5ce3c2cdb49fb851e8a025e80b514d0a150885" exitCode=0 Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.971051 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" event={"ID":"095a8eef-72de-4bd9-a435-c2b7ef2e832e","Type":"ContainerDied","Data":"c578be2151e70e7ae5c5dbbc2e5ce3c2cdb49fb851e8a025e80b514d0a150885"} Sep 30 19:05:16 crc kubenswrapper[4747]: I0930 19:05:16.971116 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" 
event={"ID":"095a8eef-72de-4bd9-a435-c2b7ef2e832e","Type":"ContainerStarted","Data":"888a137d279163f15def566c4ebb9ed2992074b4653480d6f44e4ed2e87e70bf"} Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.018323 4747 scope.go:117] "RemoveContainer" containerID="91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8" Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.023108 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-kqv9n"] Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.037486 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-kqv9n"] Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.069792 4747 scope.go:117] "RemoveContainer" containerID="2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97" Sep 30 19:05:17 crc kubenswrapper[4747]: E0930 19:05:17.070351 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97\": container with ID starting with 2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97 not found: ID does not exist" containerID="2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97" Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.070393 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97"} err="failed to get container status \"2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97\": rpc error: code = NotFound desc = could not find container \"2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97\": container with ID starting with 2b58c07c49af062c9338d93459f3d60a46ccf97798d64475120ec0c0f4b82c97 not found: ID does not exist" Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.070427 4747 scope.go:117] "RemoveContainer" 
containerID="91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8" Sep 30 19:05:17 crc kubenswrapper[4747]: E0930 19:05:17.070736 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8\": container with ID starting with 91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8 not found: ID does not exist" containerID="91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8" Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.070766 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8"} err="failed to get container status \"91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8\": rpc error: code = NotFound desc = could not find container \"91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8\": container with ID starting with 91873d912ee2434588b935e7d241886f3a99c6e358e1c80723e73498019d71b8 not found: ID does not exist" Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.106873 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="260df626-9266-4d41-ad27-630a5aa58cdf" path="/var/lib/kubelet/pods/260df626-9266-4d41-ad27-630a5aa58cdf/volumes" Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.308966 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.981688 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d6c0e23-33ea-4e36-bba3-63cdbc10089b","Type":"ContainerStarted","Data":"4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036"} Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.983864 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" event={"ID":"095a8eef-72de-4bd9-a435-c2b7ef2e832e","Type":"ContainerStarted","Data":"8692a22bd0d14ab0dd6873d5d22e345419a6b0ec7bdbc6de2f34d9b2874554b4"} Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.984027 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.988859 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3","Type":"ContainerStarted","Data":"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80"} Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.988957 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerName="cinder-api-log" containerID="cri-o://1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d" gracePeriod=30 Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.988997 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 19:05:17 crc kubenswrapper[4747]: I0930 19:05:17.989047 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerName="cinder-api" containerID="cri-o://e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80" gracePeriod=30 Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.008343 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" podStartSLOduration=3.008320208 podStartE2EDuration="3.008320208s" podCreationTimestamp="2025-09-30 19:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:18.00283944 +0000 UTC 
m=+1157.662319554" watchObservedRunningTime="2025-09-30 19:05:18.008320208 +0000 UTC m=+1157.667800322" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.022667 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.022649769 podStartE2EDuration="3.022649769s" podCreationTimestamp="2025-09-30 19:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:18.016228915 +0000 UTC m=+1157.675709029" watchObservedRunningTime="2025-09-30 19:05:18.022649769 +0000 UTC m=+1157.682129883" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.600948 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.792472 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data\") pod \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.792853 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vkkv\" (UniqueName: \"kubernetes.io/projected/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-kube-api-access-5vkkv\") pod \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.792976 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-logs\") pod \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.793074 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-scripts\") pod \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.793103 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-etc-machine-id\") pod \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.793160 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-combined-ca-bundle\") pod \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.793191 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data-custom\") pod \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\" (UID: \"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3\") " Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.794763 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" (UID: "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.795039 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-logs" (OuterVolumeSpecName: "logs") pod "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" (UID: "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.798790 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" (UID: "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.800014 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-scripts" (OuterVolumeSpecName: "scripts") pod "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" (UID: "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.804061 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-kube-api-access-5vkkv" (OuterVolumeSpecName: "kube-api-access-5vkkv") pod "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" (UID: "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3"). InnerVolumeSpecName "kube-api-access-5vkkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.820096 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" (UID: "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.876828 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data" (OuterVolumeSpecName: "config-data") pod "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" (UID: "2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.895741 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.895794 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.895816 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.895834 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 
30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.895850 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.895870 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vkkv\" (UniqueName: \"kubernetes.io/projected/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-kube-api-access-5vkkv\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:18 crc kubenswrapper[4747]: I0930 19:05:18.895889 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.002477 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d6c0e23-33ea-4e36-bba3-63cdbc10089b","Type":"ContainerStarted","Data":"d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3"} Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.006191 4747 generic.go:334] "Generic (PLEG): container finished" podID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerID="e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80" exitCode=0 Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.006217 4747 generic.go:334] "Generic (PLEG): container finished" podID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerID="1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d" exitCode=143 Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.006835 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.007097 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3","Type":"ContainerDied","Data":"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80"} Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.007168 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3","Type":"ContainerDied","Data":"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d"} Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.007186 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3","Type":"ContainerDied","Data":"f1c8ddedb20344066ca9cf9672132b34d67812f3c076f166961d90b64692434e"} Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.007214 4747 scope.go:117] "RemoveContainer" containerID="e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.030621 4747 scope.go:117] "RemoveContainer" containerID="1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.042509 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.174312694 podStartE2EDuration="4.042486798s" podCreationTimestamp="2025-09-30 19:05:15 +0000 UTC" firstStartedPulling="2025-09-30 19:05:16.070717692 +0000 UTC m=+1155.730197806" lastFinishedPulling="2025-09-30 19:05:16.938891786 +0000 UTC m=+1156.598371910" observedRunningTime="2025-09-30 19:05:19.028461755 +0000 UTC m=+1158.687941869" watchObservedRunningTime="2025-09-30 19:05:19.042486798 +0000 UTC m=+1158.701966912" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.053336 4747 scope.go:117] 
"RemoveContainer" containerID="e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.054067 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:19 crc kubenswrapper[4747]: E0930 19:05:19.054735 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80\": container with ID starting with e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80 not found: ID does not exist" containerID="e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.054795 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80"} err="failed to get container status \"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80\": rpc error: code = NotFound desc = could not find container \"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80\": container with ID starting with e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80 not found: ID does not exist" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.054832 4747 scope.go:117] "RemoveContainer" containerID="1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d" Sep 30 19:05:19 crc kubenswrapper[4747]: E0930 19:05:19.057940 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d\": container with ID starting with 1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d not found: ID does not exist" containerID="1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 
19:05:19.057986 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d"} err="failed to get container status \"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d\": rpc error: code = NotFound desc = could not find container \"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d\": container with ID starting with 1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d not found: ID does not exist" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.058013 4747 scope.go:117] "RemoveContainer" containerID="e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.058305 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80"} err="failed to get container status \"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80\": rpc error: code = NotFound desc = could not find container \"e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80\": container with ID starting with e529488755c1dad4d9541f0a588d25304d978bc086bbf2448989c87cb7c90e80 not found: ID does not exist" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.058332 4747 scope.go:117] "RemoveContainer" containerID="1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.058569 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d"} err="failed to get container status \"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d\": rpc error: code = NotFound desc = could not find container \"1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d\": container with ID starting with 
1b97e5163223e090c8426050e6380fb034d165b1349b82475de05ea9abbdab1d not found: ID does not exist" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.063158 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.081648 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:19 crc kubenswrapper[4747]: E0930 19:05:19.081969 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260df626-9266-4d41-ad27-630a5aa58cdf" containerName="dnsmasq-dns" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.081983 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="260df626-9266-4d41-ad27-630a5aa58cdf" containerName="dnsmasq-dns" Sep 30 19:05:19 crc kubenswrapper[4747]: E0930 19:05:19.081994 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerName="cinder-api" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.082000 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerName="cinder-api" Sep 30 19:05:19 crc kubenswrapper[4747]: E0930 19:05:19.082015 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260df626-9266-4d41-ad27-630a5aa58cdf" containerName="init" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.082021 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="260df626-9266-4d41-ad27-630a5aa58cdf" containerName="init" Sep 30 19:05:19 crc kubenswrapper[4747]: E0930 19:05:19.082047 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerName="cinder-api-log" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.082062 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerName="cinder-api-log" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.082200 
4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="260df626-9266-4d41-ad27-630a5aa58cdf" containerName="dnsmasq-dns" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.082213 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerName="cinder-api" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.082221 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" containerName="cinder-api-log" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.083466 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.095267 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.095522 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.097623 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.119945 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3" path="/var/lib/kubelet/pods/2fee1df4-d3f4-45a8-82a5-dfb6496bf3f3/volumes" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.122657 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.207603 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-config-data\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc 
kubenswrapper[4747]: I0930 19:05:19.207780 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrf7w\" (UniqueName: \"kubernetes.io/projected/13be932b-9552-4483-a16e-30c0564032b3-kube-api-access-zrf7w\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.207897 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.207993 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13be932b-9552-4483-a16e-30c0564032b3-logs\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.208638 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.208686 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-scripts\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.208729 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.208755 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13be932b-9552-4483-a16e-30c0564032b3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.208796 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.310693 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrf7w\" (UniqueName: \"kubernetes.io/projected/13be932b-9552-4483-a16e-30c0564032b3-kube-api-access-zrf7w\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.310777 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.310827 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13be932b-9552-4483-a16e-30c0564032b3-logs\") pod \"cinder-api-0\" (UID: 
\"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.310976 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.311007 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-scripts\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.311046 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.311080 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13be932b-9552-4483-a16e-30c0564032b3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.311121 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.311205 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-config-data\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.311732 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13be932b-9552-4483-a16e-30c0564032b3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.311906 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13be932b-9552-4483-a16e-30c0564032b3-logs\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.316410 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-scripts\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.316753 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-config-data-custom\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.317647 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.317754 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.319916 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-config-data\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.325389 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13be932b-9552-4483-a16e-30c0564032b3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.331959 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrf7w\" (UniqueName: \"kubernetes.io/projected/13be932b-9552-4483-a16e-30c0564032b3-kube-api-access-zrf7w\") pod \"cinder-api-0\" (UID: \"13be932b-9552-4483-a16e-30c0564032b3\") " pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.406979 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:05:19 crc kubenswrapper[4747]: I0930 19:05:19.707183 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:05:20 crc kubenswrapper[4747]: I0930 19:05:20.014980 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13be932b-9552-4483-a16e-30c0564032b3","Type":"ContainerStarted","Data":"27ad3f0f36e41f338f69b183fe1d86d5f4a51999cc143ecb26d6672c6cd2acba"} Sep 30 19:05:20 crc kubenswrapper[4747]: I0930 19:05:20.565064 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 19:05:21 crc kubenswrapper[4747]: I0930 19:05:21.025842 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13be932b-9552-4483-a16e-30c0564032b3","Type":"ContainerStarted","Data":"a4ee63e228e7ab14ef41ed85b9c876191699b9d9687a4359e9979a1327b76978"} Sep 30 19:05:21 crc kubenswrapper[4747]: I0930 19:05:21.143042 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66b578897b-btxxw" Sep 30 19:05:21 crc kubenswrapper[4747]: I0930 19:05:21.158806 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66b578897b-btxxw" Sep 30 19:05:22 crc kubenswrapper[4747]: I0930 19:05:22.037243 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13be932b-9552-4483-a16e-30c0564032b3","Type":"ContainerStarted","Data":"462a65cd6e46f1e56c6c82d0f19a68d15421360a3e0ee967ae6e6c9f97c3d301"} Sep 30 19:05:22 crc kubenswrapper[4747]: I0930 19:05:22.063671 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.063654432 podStartE2EDuration="3.063654432s" podCreationTimestamp="2025-09-30 19:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:22.057067443 +0000 UTC m=+1161.716547597" watchObservedRunningTime="2025-09-30 19:05:22.063654432 +0000 UTC m=+1161.723134546" Sep 30 19:05:23 crc kubenswrapper[4747]: I0930 19:05:23.047654 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 19:05:23 crc kubenswrapper[4747]: I0930 19:05:23.593325 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-648477b6b5-24gx7" Sep 30 19:05:25 crc kubenswrapper[4747]: I0930 19:05:25.630118 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:05:25 crc kubenswrapper[4747]: I0930 19:05:25.720238 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-flzbs"] Sep 30 19:05:25 crc kubenswrapper[4747]: I0930 19:05:25.720591 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" podUID="585e57fe-da27-46d4-9b1d-d9242b17081c" containerName="dnsmasq-dns" containerID="cri-o://27dcf8ddf5ae46474c83614323b2e5c7b559306ca26994baf28acc96acbbb58c" gracePeriod=10 Sep 30 19:05:25 crc kubenswrapper[4747]: I0930 19:05:25.886024 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 19:05:25 crc kubenswrapper[4747]: I0930 19:05:25.918116 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.116745 4747 generic.go:334] "Generic (PLEG): container finished" podID="585e57fe-da27-46d4-9b1d-d9242b17081c" containerID="27dcf8ddf5ae46474c83614323b2e5c7b559306ca26994baf28acc96acbbb58c" exitCode=0 Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.116969 4747 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerName="cinder-scheduler" containerID="cri-o://4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036" gracePeriod=30 Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.117236 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerName="probe" containerID="cri-o://d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3" gracePeriod=30 Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.117275 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" event={"ID":"585e57fe-da27-46d4-9b1d-d9242b17081c","Type":"ContainerDied","Data":"27dcf8ddf5ae46474c83614323b2e5c7b559306ca26994baf28acc96acbbb58c"} Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.201226 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.255557 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-nb\") pod \"585e57fe-da27-46d4-9b1d-d9242b17081c\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.255599 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-dns-svc\") pod \"585e57fe-da27-46d4-9b1d-d9242b17081c\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.255615 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-config\") 
pod \"585e57fe-da27-46d4-9b1d-d9242b17081c\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.255669 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-sb\") pod \"585e57fe-da27-46d4-9b1d-d9242b17081c\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.255756 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vmlc\" (UniqueName: \"kubernetes.io/projected/585e57fe-da27-46d4-9b1d-d9242b17081c-kube-api-access-5vmlc\") pod \"585e57fe-da27-46d4-9b1d-d9242b17081c\" (UID: \"585e57fe-da27-46d4-9b1d-d9242b17081c\") " Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.267962 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585e57fe-da27-46d4-9b1d-d9242b17081c-kube-api-access-5vmlc" (OuterVolumeSpecName: "kube-api-access-5vmlc") pod "585e57fe-da27-46d4-9b1d-d9242b17081c" (UID: "585e57fe-da27-46d4-9b1d-d9242b17081c"). InnerVolumeSpecName "kube-api-access-5vmlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.338198 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "585e57fe-da27-46d4-9b1d-d9242b17081c" (UID: "585e57fe-da27-46d4-9b1d-d9242b17081c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.340209 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "585e57fe-da27-46d4-9b1d-d9242b17081c" (UID: "585e57fe-da27-46d4-9b1d-d9242b17081c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.347367 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-config" (OuterVolumeSpecName: "config") pod "585e57fe-da27-46d4-9b1d-d9242b17081c" (UID: "585e57fe-da27-46d4-9b1d-d9242b17081c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.352279 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "585e57fe-da27-46d4-9b1d-d9242b17081c" (UID: "585e57fe-da27-46d4-9b1d-d9242b17081c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.357086 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.357109 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vmlc\" (UniqueName: \"kubernetes.io/projected/585e57fe-da27-46d4-9b1d-d9242b17081c-kube-api-access-5vmlc\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.357121 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.357130 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.357140 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/585e57fe-da27-46d4-9b1d-d9242b17081c-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.957184 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 19:05:26 crc kubenswrapper[4747]: E0930 19:05:26.957807 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585e57fe-da27-46d4-9b1d-d9242b17081c" containerName="dnsmasq-dns" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.957821 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="585e57fe-da27-46d4-9b1d-d9242b17081c" containerName="dnsmasq-dns" Sep 30 19:05:26 crc kubenswrapper[4747]: E0930 19:05:26.957835 4747 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585e57fe-da27-46d4-9b1d-d9242b17081c" containerName="init" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.957841 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="585e57fe-da27-46d4-9b1d-d9242b17081c" containerName="init" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.958059 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="585e57fe-da27-46d4-9b1d-d9242b17081c" containerName="dnsmasq-dns" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.958677 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.961154 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gsdpq" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.961203 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.961601 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 19:05:26 crc kubenswrapper[4747]: I0930 19:05:26.977009 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.068675 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ad8842-6027-4f43-b6bf-82096e3c90a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.068749 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7c7q\" (UniqueName: 
\"kubernetes.io/projected/a3ad8842-6027-4f43-b6bf-82096e3c90a3-kube-api-access-t7c7q\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.068797 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3ad8842-6027-4f43-b6bf-82096e3c90a3-openstack-config-secret\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.069052 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3ad8842-6027-4f43-b6bf-82096e3c90a3-openstack-config\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.126161 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" event={"ID":"585e57fe-da27-46d4-9b1d-d9242b17081c","Type":"ContainerDied","Data":"1aad4ff16ffbb3ef4e3543bc9b1f362c43f36616fc6547da005b67296b45d03d"} Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.126205 4747 scope.go:117] "RemoveContainer" containerID="27dcf8ddf5ae46474c83614323b2e5c7b559306ca26994baf28acc96acbbb58c" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.126303 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-flzbs" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.130922 4747 generic.go:334] "Generic (PLEG): container finished" podID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerID="d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3" exitCode=0 Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.130991 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d6c0e23-33ea-4e36-bba3-63cdbc10089b","Type":"ContainerDied","Data":"d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3"} Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.149632 4747 scope.go:117] "RemoveContainer" containerID="383fe629a07f286528ea512fa3ba5c08b182a8d388e10d26a5aabb548c9c4280" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.152339 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-flzbs"] Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.161286 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-flzbs"] Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.171071 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3ad8842-6027-4f43-b6bf-82096e3c90a3-openstack-config-secret\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.171259 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3ad8842-6027-4f43-b6bf-82096e3c90a3-openstack-config\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.171350 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ad8842-6027-4f43-b6bf-82096e3c90a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.171517 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7c7q\" (UniqueName: \"kubernetes.io/projected/a3ad8842-6027-4f43-b6bf-82096e3c90a3-kube-api-access-t7c7q\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.173291 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a3ad8842-6027-4f43-b6bf-82096e3c90a3-openstack-config\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.178072 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a3ad8842-6027-4f43-b6bf-82096e3c90a3-openstack-config-secret\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.179374 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ad8842-6027-4f43-b6bf-82096e3c90a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.190594 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7c7q\" (UniqueName: 
\"kubernetes.io/projected/a3ad8842-6027-4f43-b6bf-82096e3c90a3-kube-api-access-t7c7q\") pod \"openstackclient\" (UID: \"a3ad8842-6027-4f43-b6bf-82096e3c90a3\") " pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.288255 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 19:05:27 crc kubenswrapper[4747]: I0930 19:05:27.854064 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 19:05:28 crc kubenswrapper[4747]: I0930 19:05:28.146904 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a3ad8842-6027-4f43-b6bf-82096e3c90a3","Type":"ContainerStarted","Data":"de7defe4e88bdd4ee173ab6eacf417e8b21a6ca41811ed989c1a3e09ce53c413"} Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.081065 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.101426 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585e57fe-da27-46d4-9b1d-d9242b17081c" path="/var/lib/kubelet/pods/585e57fe-da27-46d4-9b1d-d9242b17081c/volumes" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.167947 4747 generic.go:334] "Generic (PLEG): container finished" podID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerID="4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036" exitCode=0 Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.168021 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d6c0e23-33ea-4e36-bba3-63cdbc10089b","Type":"ContainerDied","Data":"4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036"} Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.168089 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"4d6c0e23-33ea-4e36-bba3-63cdbc10089b","Type":"ContainerDied","Data":"7e504fbb5bed641a5fc6bda666aa3d5b53ac23a26b9b98a0813adb4d21aa8e26"} Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.168152 4747 scope.go:117] "RemoveContainer" containerID="d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.168286 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.209684 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-scripts\") pod \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.209745 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data-custom\") pod \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.209911 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-combined-ca-bundle\") pod \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.209955 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-etc-machine-id\") pod \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.209976 4747 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data\") pod \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.210086 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4d6c0e23-33ea-4e36-bba3-63cdbc10089b" (UID: "4d6c0e23-33ea-4e36-bba3-63cdbc10089b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.210113 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vht2m\" (UniqueName: \"kubernetes.io/projected/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-kube-api-access-vht2m\") pod \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\" (UID: \"4d6c0e23-33ea-4e36-bba3-63cdbc10089b\") " Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.210477 4747 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.217083 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4d6c0e23-33ea-4e36-bba3-63cdbc10089b" (UID: "4d6c0e23-33ea-4e36-bba3-63cdbc10089b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.217334 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-scripts" (OuterVolumeSpecName: "scripts") pod "4d6c0e23-33ea-4e36-bba3-63cdbc10089b" (UID: "4d6c0e23-33ea-4e36-bba3-63cdbc10089b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.222290 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-kube-api-access-vht2m" (OuterVolumeSpecName: "kube-api-access-vht2m") pod "4d6c0e23-33ea-4e36-bba3-63cdbc10089b" (UID: "4d6c0e23-33ea-4e36-bba3-63cdbc10089b"). InnerVolumeSpecName "kube-api-access-vht2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.235143 4747 scope.go:117] "RemoveContainer" containerID="4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.267872 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d6c0e23-33ea-4e36-bba3-63cdbc10089b" (UID: "4d6c0e23-33ea-4e36-bba3-63cdbc10089b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.314075 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.314664 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vht2m\" (UniqueName: \"kubernetes.io/projected/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-kube-api-access-vht2m\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.314725 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.314782 4747 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.335324 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data" (OuterVolumeSpecName: "config-data") pod "4d6c0e23-33ea-4e36-bba3-63cdbc10089b" (UID: "4d6c0e23-33ea-4e36-bba3-63cdbc10089b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.359605 4747 scope.go:117] "RemoveContainer" containerID="d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3" Sep 30 19:05:29 crc kubenswrapper[4747]: E0930 19:05:29.360135 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3\": container with ID starting with d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3 not found: ID does not exist" containerID="d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.360180 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3"} err="failed to get container status \"d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3\": rpc error: code = NotFound desc = could not find container \"d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3\": container with ID starting with d3ed7a0f147803ae2c6ea5c1436f39f96abf33168b3c58a5e70f33cf997ea8d3 not found: ID does not exist" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.360210 4747 scope.go:117] "RemoveContainer" containerID="4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036" Sep 30 19:05:29 crc kubenswrapper[4747]: E0930 19:05:29.360699 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036\": container with ID starting with 4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036 not found: ID does not exist" containerID="4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.360737 
4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036"} err="failed to get container status \"4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036\": rpc error: code = NotFound desc = could not find container \"4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036\": container with ID starting with 4b7b2ed7453f7f237c092587eac9b6faea4b3b9d1a2daf1a07f8c4277554b036 not found: ID does not exist" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.416840 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6c0e23-33ea-4e36-bba3-63cdbc10089b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.506337 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.517800 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.531556 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:29 crc kubenswrapper[4747]: E0930 19:05:29.532809 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerName="probe" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.532829 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerName="probe" Sep 30 19:05:29 crc kubenswrapper[4747]: E0930 19:05:29.532852 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerName="cinder-scheduler" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.532861 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" 
containerName="cinder-scheduler" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.533034 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerName="cinder-scheduler" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.533052 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" containerName="probe" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.534856 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.538979 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.563001 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.620822 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6qcr\" (UniqueName: \"kubernetes.io/projected/24a9ac8d-d5be-42b1-95f3-9677a6f12434-kube-api-access-d6qcr\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.620870 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.620955 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/24a9ac8d-d5be-42b1-95f3-9677a6f12434-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.621005 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-config-data\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.621040 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.621260 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-scripts\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.722573 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.722630 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.722664 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6qcr\" (UniqueName: \"kubernetes.io/projected/24a9ac8d-d5be-42b1-95f3-9677a6f12434-kube-api-access-d6qcr\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.722686 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.722733 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24a9ac8d-d5be-42b1-95f3-9677a6f12434-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.722780 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-config-data\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.724736 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24a9ac8d-d5be-42b1-95f3-9677a6f12434-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.726956 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.729144 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-config-data\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.730626 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.731669 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a9ac8d-d5be-42b1-95f3-9677a6f12434-scripts\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.744123 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6qcr\" (UniqueName: \"kubernetes.io/projected/24a9ac8d-d5be-42b1-95f3-9677a6f12434-kube-api-access-d6qcr\") pod \"cinder-scheduler-0\" (UID: \"24a9ac8d-d5be-42b1-95f3-9677a6f12434\") " pod="openstack/cinder-scheduler-0" Sep 30 19:05:29 crc kubenswrapper[4747]: I0930 19:05:29.869559 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:05:30 crc kubenswrapper[4747]: I0930 19:05:30.398791 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:05:31 crc kubenswrapper[4747]: I0930 19:05:31.101502 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d6c0e23-33ea-4e36-bba3-63cdbc10089b" path="/var/lib/kubelet/pods/4d6c0e23-33ea-4e36-bba3-63cdbc10089b/volumes" Sep 30 19:05:31 crc kubenswrapper[4747]: I0930 19:05:31.209499 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24a9ac8d-d5be-42b1-95f3-9677a6f12434","Type":"ContainerStarted","Data":"572a27e94bc8d52f7b3b9c8372cc0f9c426fa7fa58df5ef09580cc446dbe1d52"} Sep 30 19:05:31 crc kubenswrapper[4747]: I0930 19:05:31.209539 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24a9ac8d-d5be-42b1-95f3-9677a6f12434","Type":"ContainerStarted","Data":"997378399a00c86797e8d45b55c52bb0316b5f39ec6e99930578047133269b0c"} Sep 30 19:05:31 crc kubenswrapper[4747]: I0930 19:05:31.451229 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.235420 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24a9ac8d-d5be-42b1-95f3-9677a6f12434","Type":"ContainerStarted","Data":"307fa39c2a00b102c410c2cdb05131ab4911ef496b1f1a20b310d40acbf126ec"} Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.256002 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.255983093 podStartE2EDuration="3.255983093s" podCreationTimestamp="2025-09-30 19:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:32.254687716 +0000 
UTC m=+1171.914167830" watchObservedRunningTime="2025-09-30 19:05:32.255983093 +0000 UTC m=+1171.915463207" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.538418 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-8lhgs"] Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.539544 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8lhgs" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.549405 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8lhgs"] Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.575374 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnhv\" (UniqueName: \"kubernetes.io/projected/08a20f42-1c6e-40b6-b893-c77dd9f79b01-kube-api-access-8fnhv\") pod \"nova-api-db-create-8lhgs\" (UID: \"08a20f42-1c6e-40b6-b893-c77dd9f79b01\") " pod="openstack/nova-api-db-create-8lhgs" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.631012 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pv4ms"] Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.632023 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pv4ms" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.645471 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pv4ms"] Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.677438 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdtcl\" (UniqueName: \"kubernetes.io/projected/2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf-kube-api-access-wdtcl\") pod \"nova-cell0-db-create-pv4ms\" (UID: \"2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf\") " pod="openstack/nova-cell0-db-create-pv4ms" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.677733 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fnhv\" (UniqueName: \"kubernetes.io/projected/08a20f42-1c6e-40b6-b893-c77dd9f79b01-kube-api-access-8fnhv\") pod \"nova-api-db-create-8lhgs\" (UID: \"08a20f42-1c6e-40b6-b893-c77dd9f79b01\") " pod="openstack/nova-api-db-create-8lhgs" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.694576 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fnhv\" (UniqueName: \"kubernetes.io/projected/08a20f42-1c6e-40b6-b893-c77dd9f79b01-kube-api-access-8fnhv\") pod \"nova-api-db-create-8lhgs\" (UID: \"08a20f42-1c6e-40b6-b893-c77dd9f79b01\") " pod="openstack/nova-api-db-create-8lhgs" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.739328 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ssjx4"] Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.744628 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ssjx4" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.750423 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ssjx4"] Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.781664 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9lf\" (UniqueName: \"kubernetes.io/projected/f10981c2-a487-449d-85d6-d6cdf33815b5-kube-api-access-kj9lf\") pod \"nova-cell1-db-create-ssjx4\" (UID: \"f10981c2-a487-449d-85d6-d6cdf33815b5\") " pod="openstack/nova-cell1-db-create-ssjx4" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.781780 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdtcl\" (UniqueName: \"kubernetes.io/projected/2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf-kube-api-access-wdtcl\") pod \"nova-cell0-db-create-pv4ms\" (UID: \"2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf\") " pod="openstack/nova-cell0-db-create-pv4ms" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.799497 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdtcl\" (UniqueName: \"kubernetes.io/projected/2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf-kube-api-access-wdtcl\") pod \"nova-cell0-db-create-pv4ms\" (UID: \"2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf\") " pod="openstack/nova-cell0-db-create-pv4ms" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.855685 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8lhgs" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.884332 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9lf\" (UniqueName: \"kubernetes.io/projected/f10981c2-a487-449d-85d6-d6cdf33815b5-kube-api-access-kj9lf\") pod \"nova-cell1-db-create-ssjx4\" (UID: \"f10981c2-a487-449d-85d6-d6cdf33815b5\") " pod="openstack/nova-cell1-db-create-ssjx4" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.902124 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj9lf\" (UniqueName: \"kubernetes.io/projected/f10981c2-a487-449d-85d6-d6cdf33815b5-kube-api-access-kj9lf\") pod \"nova-cell1-db-create-ssjx4\" (UID: \"f10981c2-a487-449d-85d6-d6cdf33815b5\") " pod="openstack/nova-cell1-db-create-ssjx4" Sep 30 19:05:32 crc kubenswrapper[4747]: I0930 19:05:32.946612 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pv4ms" Sep 30 19:05:33 crc kubenswrapper[4747]: I0930 19:05:33.072019 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ssjx4" Sep 30 19:05:33 crc kubenswrapper[4747]: I0930 19:05:33.323206 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8lhgs"] Sep 30 19:05:33 crc kubenswrapper[4747]: W0930 19:05:33.332962 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a20f42_1c6e_40b6_b893_c77dd9f79b01.slice/crio-2035d51d4d5ac13c6c34ad4039dd4351e1b9d812c7f0400939910d2d16f6c980 WatchSource:0}: Error finding container 2035d51d4d5ac13c6c34ad4039dd4351e1b9d812c7f0400939910d2d16f6c980: Status 404 returned error can't find the container with id 2035d51d4d5ac13c6c34ad4039dd4351e1b9d812c7f0400939910d2d16f6c980 Sep 30 19:05:33 crc kubenswrapper[4747]: I0930 19:05:33.430853 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pv4ms"] Sep 30 19:05:33 crc kubenswrapper[4747]: W0930 19:05:33.446219 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b022cfe_5c7b_401f_bc2e_6bee89cd5cdf.slice/crio-ceeb266411b9d0dc92de7be58ba07d6287473187b9654e70b1a977fd3202ce2c WatchSource:0}: Error finding container ceeb266411b9d0dc92de7be58ba07d6287473187b9654e70b1a977fd3202ce2c: Status 404 returned error can't find the container with id ceeb266411b9d0dc92de7be58ba07d6287473187b9654e70b1a977fd3202ce2c Sep 30 19:05:33 crc kubenswrapper[4747]: I0930 19:05:33.530182 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ssjx4"] Sep 30 19:05:33 crc kubenswrapper[4747]: W0930 19:05:33.534367 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10981c2_a487_449d_85d6_d6cdf33815b5.slice/crio-e3454dba5485525a498f44f3ef92def3aac9390c4d9391acbcea3feb6605f3b9 WatchSource:0}: Error finding container 
e3454dba5485525a498f44f3ef92def3aac9390c4d9391acbcea3feb6605f3b9: Status 404 returned error can't find the container with id e3454dba5485525a498f44f3ef92def3aac9390c4d9391acbcea3feb6605f3b9 Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.259197 4747 generic.go:334] "Generic (PLEG): container finished" podID="08a20f42-1c6e-40b6-b893-c77dd9f79b01" containerID="4ce6f38185f13f9a9245656a867548db218454ea98e6734786ffe5229d82fe3b" exitCode=0 Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.259238 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8lhgs" event={"ID":"08a20f42-1c6e-40b6-b893-c77dd9f79b01","Type":"ContainerDied","Data":"4ce6f38185f13f9a9245656a867548db218454ea98e6734786ffe5229d82fe3b"} Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.259489 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8lhgs" event={"ID":"08a20f42-1c6e-40b6-b893-c77dd9f79b01","Type":"ContainerStarted","Data":"2035d51d4d5ac13c6c34ad4039dd4351e1b9d812c7f0400939910d2d16f6c980"} Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.261836 4747 generic.go:334] "Generic (PLEG): container finished" podID="f10981c2-a487-449d-85d6-d6cdf33815b5" containerID="47931ef5852e5d81bcdac51304cc3f06675c644c43b07d4b41032fa55314edcb" exitCode=0 Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.261919 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ssjx4" event={"ID":"f10981c2-a487-449d-85d6-d6cdf33815b5","Type":"ContainerDied","Data":"47931ef5852e5d81bcdac51304cc3f06675c644c43b07d4b41032fa55314edcb"} Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.261969 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ssjx4" event={"ID":"f10981c2-a487-449d-85d6-d6cdf33815b5","Type":"ContainerStarted","Data":"e3454dba5485525a498f44f3ef92def3aac9390c4d9391acbcea3feb6605f3b9"} Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 
19:05:34.263861 4747 generic.go:334] "Generic (PLEG): container finished" podID="2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf" containerID="4402083650e1f6bdbf9d9ea20c7e1e94ee24e2e60621d16fdc7db4b23c094785" exitCode=0 Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.263911 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pv4ms" event={"ID":"2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf","Type":"ContainerDied","Data":"4402083650e1f6bdbf9d9ea20c7e1e94ee24e2e60621d16fdc7db4b23c094785"} Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.263992 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pv4ms" event={"ID":"2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf","Type":"ContainerStarted","Data":"ceeb266411b9d0dc92de7be58ba07d6287473187b9654e70b1a977fd3202ce2c"} Sep 30 19:05:34 crc kubenswrapper[4747]: I0930 19:05:34.870485 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 19:05:38 crc kubenswrapper[4747]: I0930 19:05:38.043700 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:05:38 crc kubenswrapper[4747]: I0930 19:05:38.045172 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerName="glance-log" containerID="cri-o://919d6e650959b54fe7556e9a78b03101837d97ef2d2ca35ac0183d02f039361b" gracePeriod=30 Sep 30 19:05:38 crc kubenswrapper[4747]: I0930 19:05:38.045267 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerName="glance-httpd" containerID="cri-o://c990a0cb5652bc98b3d7e30e8f51aa2beb1461811b7233bb7f25a8635cfb91fd" gracePeriod=30 Sep 30 19:05:38 crc kubenswrapper[4747]: I0930 19:05:38.316281 4747 generic.go:334] "Generic (PLEG): 
container finished" podID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerID="919d6e650959b54fe7556e9a78b03101837d97ef2d2ca35ac0183d02f039361b" exitCode=143 Sep 30 19:05:38 crc kubenswrapper[4747]: I0930 19:05:38.316328 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4","Type":"ContainerDied","Data":"919d6e650959b54fe7556e9a78b03101837d97ef2d2ca35ac0183d02f039361b"} Sep 30 19:05:38 crc kubenswrapper[4747]: I0930 19:05:38.825785 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:05:38 crc kubenswrapper[4747]: I0930 19:05:38.826087 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerName="glance-log" containerID="cri-o://2ae2c51de35c0de7257e4e9a5f833a1878d488774559ad44b39c31c302e2937a" gracePeriod=30 Sep 30 19:05:38 crc kubenswrapper[4747]: I0930 19:05:38.826186 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerName="glance-httpd" containerID="cri-o://d596236814f9eaa207e959f914461e79e4719a27bdd84f9494f3fa02a3c9e822" gracePeriod=30 Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.329300 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8lhgs" event={"ID":"08a20f42-1c6e-40b6-b893-c77dd9f79b01","Type":"ContainerDied","Data":"2035d51d4d5ac13c6c34ad4039dd4351e1b9d812c7f0400939910d2d16f6c980"} Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.329366 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2035d51d4d5ac13c6c34ad4039dd4351e1b9d812c7f0400939910d2d16f6c980" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.331990 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-ssjx4" event={"ID":"f10981c2-a487-449d-85d6-d6cdf33815b5","Type":"ContainerDied","Data":"e3454dba5485525a498f44f3ef92def3aac9390c4d9391acbcea3feb6605f3b9"} Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.332149 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3454dba5485525a498f44f3ef92def3aac9390c4d9391acbcea3feb6605f3b9" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.334314 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pv4ms" event={"ID":"2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf","Type":"ContainerDied","Data":"ceeb266411b9d0dc92de7be58ba07d6287473187b9654e70b1a977fd3202ce2c"} Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.334352 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceeb266411b9d0dc92de7be58ba07d6287473187b9654e70b1a977fd3202ce2c" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.337399 4747 generic.go:334] "Generic (PLEG): container finished" podID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerID="2ae2c51de35c0de7257e4e9a5f833a1878d488774559ad44b39c31c302e2937a" exitCode=143 Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.337443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0","Type":"ContainerDied","Data":"2ae2c51de35c0de7257e4e9a5f833a1878d488774559ad44b39c31c302e2937a"} Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.482248 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ssjx4" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.523645 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pv4ms" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.531162 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj9lf\" (UniqueName: \"kubernetes.io/projected/f10981c2-a487-449d-85d6-d6cdf33815b5-kube-api-access-kj9lf\") pod \"f10981c2-a487-449d-85d6-d6cdf33815b5\" (UID: \"f10981c2-a487-449d-85d6-d6cdf33815b5\") " Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.542291 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10981c2-a487-449d-85d6-d6cdf33815b5-kube-api-access-kj9lf" (OuterVolumeSpecName: "kube-api-access-kj9lf") pod "f10981c2-a487-449d-85d6-d6cdf33815b5" (UID: "f10981c2-a487-449d-85d6-d6cdf33815b5"). InnerVolumeSpecName "kube-api-access-kj9lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.546644 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8lhgs" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.632288 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fnhv\" (UniqueName: \"kubernetes.io/projected/08a20f42-1c6e-40b6-b893-c77dd9f79b01-kube-api-access-8fnhv\") pod \"08a20f42-1c6e-40b6-b893-c77dd9f79b01\" (UID: \"08a20f42-1c6e-40b6-b893-c77dd9f79b01\") " Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.632355 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdtcl\" (UniqueName: \"kubernetes.io/projected/2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf-kube-api-access-wdtcl\") pod \"2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf\" (UID: \"2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf\") " Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.632657 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj9lf\" (UniqueName: \"kubernetes.io/projected/f10981c2-a487-449d-85d6-d6cdf33815b5-kube-api-access-kj9lf\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.636054 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf-kube-api-access-wdtcl" (OuterVolumeSpecName: "kube-api-access-wdtcl") pod "2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf" (UID: "2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf"). InnerVolumeSpecName "kube-api-access-wdtcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.636422 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a20f42-1c6e-40b6-b893-c77dd9f79b01-kube-api-access-8fnhv" (OuterVolumeSpecName: "kube-api-access-8fnhv") pod "08a20f42-1c6e-40b6-b893-c77dd9f79b01" (UID: "08a20f42-1c6e-40b6-b893-c77dd9f79b01"). InnerVolumeSpecName "kube-api-access-8fnhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.734552 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fnhv\" (UniqueName: \"kubernetes.io/projected/08a20f42-1c6e-40b6-b893-c77dd9f79b01-kube-api-access-8fnhv\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:39 crc kubenswrapper[4747]: I0930 19:05:39.734600 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdtcl\" (UniqueName: \"kubernetes.io/projected/2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf-kube-api-access-wdtcl\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:40 crc kubenswrapper[4747]: I0930 19:05:40.064503 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 19:05:40 crc kubenswrapper[4747]: I0930 19:05:40.348077 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pv4ms" Sep 30 19:05:40 crc kubenswrapper[4747]: I0930 19:05:40.348137 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8lhgs" Sep 30 19:05:40 crc kubenswrapper[4747]: I0930 19:05:40.348066 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a3ad8842-6027-4f43-b6bf-82096e3c90a3","Type":"ContainerStarted","Data":"5b4d03a31cd3ad8fb110d750153e69ace8250c4d32bcdc562306e3d51660326d"} Sep 30 19:05:40 crc kubenswrapper[4747]: I0930 19:05:40.348291 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ssjx4" Sep 30 19:05:40 crc kubenswrapper[4747]: I0930 19:05:40.374447 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.867138474 podStartE2EDuration="14.374428795s" podCreationTimestamp="2025-09-30 19:05:26 +0000 UTC" firstStartedPulling="2025-09-30 19:05:27.860514255 +0000 UTC m=+1167.519994409" lastFinishedPulling="2025-09-30 19:05:39.367804586 +0000 UTC m=+1179.027284730" observedRunningTime="2025-09-30 19:05:40.365334024 +0000 UTC m=+1180.024814138" watchObservedRunningTime="2025-09-30 19:05:40.374428795 +0000 UTC m=+1180.033908909" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.360278 4747 generic.go:334] "Generic (PLEG): container finished" podID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerID="c990a0cb5652bc98b3d7e30e8f51aa2beb1461811b7233bb7f25a8635cfb91fd" exitCode=0 Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.360341 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4","Type":"ContainerDied","Data":"c990a0cb5652bc98b3d7e30e8f51aa2beb1461811b7233bb7f25a8635cfb91fd"} Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.502592 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.790678 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.885442 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-logs\") pod \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.885531 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-httpd-run\") pod \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.885614 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-combined-ca-bundle\") pod \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.885645 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-scripts\") pod \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.885666 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-public-tls-certs\") pod \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.885694 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4g98\" (UniqueName: 
\"kubernetes.io/projected/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-kube-api-access-d4g98\") pod \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.885770 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.885793 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-config-data\") pod \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\" (UID: \"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4\") " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.887074 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" (UID: "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.887320 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-logs" (OuterVolumeSpecName: "logs") pod "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" (UID: "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.908120 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" (UID: "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4"). 
InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.910113 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-kube-api-access-d4g98" (OuterVolumeSpecName: "kube-api-access-d4g98") pod "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" (UID: "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4"). InnerVolumeSpecName "kube-api-access-d4g98". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.932096 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-scripts" (OuterVolumeSpecName: "scripts") pod "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" (UID: "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.990910 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.990966 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.990975 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.990983 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 
19:05:41 crc kubenswrapper[4747]: I0930 19:05:41.990993 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4g98\" (UniqueName: \"kubernetes.io/projected/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-kube-api-access-d4g98\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.013072 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" (UID: "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.041501 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" (UID: "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.062415 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.065072 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-config-data" (OuterVolumeSpecName: "config-data") pod "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" (UID: "688a1ce6-0861-4c45-9e0c-a94c65a5c5d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.092264 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.092401 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.092482 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.092578 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.373239 4747 generic.go:334] "Generic (PLEG): container finished" podID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerID="d596236814f9eaa207e959f914461e79e4719a27bdd84f9494f3fa02a3c9e822" exitCode=0 Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.373495 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0","Type":"ContainerDied","Data":"d596236814f9eaa207e959f914461e79e4719a27bdd84f9494f3fa02a3c9e822"} Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.375709 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"688a1ce6-0861-4c45-9e0c-a94c65a5c5d4","Type":"ContainerDied","Data":"54fd27b3c778951e3f2357c2908e6905f603f7ebb82b83b9147133fe14b98388"} Sep 30 19:05:42 crc 
kubenswrapper[4747]: I0930 19:05:42.375784 4747 scope.go:117] "RemoveContainer" containerID="c990a0cb5652bc98b3d7e30e8f51aa2beb1461811b7233bb7f25a8635cfb91fd" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.375917 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.420291 4747 scope.go:117] "RemoveContainer" containerID="919d6e650959b54fe7556e9a78b03101837d97ef2d2ca35ac0183d02f039361b" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.438335 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.471970 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481158 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:05:42 crc kubenswrapper[4747]: E0930 19:05:42.481579 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10981c2-a487-449d-85d6-d6cdf33815b5" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481596 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10981c2-a487-449d-85d6-d6cdf33815b5" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: E0930 19:05:42.481614 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481622 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: E0930 19:05:42.481640 4747 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="08a20f42-1c6e-40b6-b893-c77dd9f79b01" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481651 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a20f42-1c6e-40b6-b893-c77dd9f79b01" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: E0930 19:05:42.481658 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerName="glance-httpd" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481664 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerName="glance-httpd" Sep 30 19:05:42 crc kubenswrapper[4747]: E0930 19:05:42.481685 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerName="glance-log" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481692 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerName="glance-log" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481864 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481883 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10981c2-a487-449d-85d6-d6cdf33815b5" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481897 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a20f42-1c6e-40b6-b893-c77dd9f79b01" containerName="mariadb-database-create" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481909 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerName="glance-httpd" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.481919 4747 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" containerName="glance-log" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.482808 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.484703 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.487849 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.490106 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.605019 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.605066 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.605125 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6393639-cf22-49a1-96e1-f20d11a72791-logs\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 
crc kubenswrapper[4747]: I0930 19:05:42.605146 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcc2x\" (UniqueName: \"kubernetes.io/projected/a6393639-cf22-49a1-96e1-f20d11a72791-kube-api-access-rcc2x\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.605183 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.605212 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.605247 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.605273 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6393639-cf22-49a1-96e1-f20d11a72791-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " 
pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.625568 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706468 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706575 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6393639-cf22-49a1-96e1-f20d11a72791-logs\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706609 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcc2x\" (UniqueName: \"kubernetes.io/projected/a6393639-cf22-49a1-96e1-f20d11a72791-kube-api-access-rcc2x\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706644 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706709 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706758 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706794 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6393639-cf22-49a1-96e1-f20d11a72791-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706833 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.706971 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.707699 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6393639-cf22-49a1-96e1-f20d11a72791-logs\") pod 
\"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.707977 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6393639-cf22-49a1-96e1-f20d11a72791-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.713700 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.714590 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.715244 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.722349 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6393639-cf22-49a1-96e1-f20d11a72791-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " 
pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.723593 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcc2x\" (UniqueName: \"kubernetes.io/projected/a6393639-cf22-49a1-96e1-f20d11a72791-kube-api-access-rcc2x\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.756493 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a6393639-cf22-49a1-96e1-f20d11a72791\") " pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.776368 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-acf1-account-create-bq4hx"] Sep 30 19:05:42 crc kubenswrapper[4747]: E0930 19:05:42.778659 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerName="glance-httpd" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.778685 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerName="glance-httpd" Sep 30 19:05:42 crc kubenswrapper[4747]: E0930 19:05:42.778719 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerName="glance-log" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.778726 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerName="glance-log" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.780364 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerName="glance-httpd" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 
19:05:42.780430 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" containerName="glance-log" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.781746 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-acf1-account-create-bq4hx" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.788585 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.805527 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.811549 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-internal-tls-certs\") pod \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.811648 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-scripts\") pod \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.811742 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-httpd-run\") pod \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.811768 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-combined-ca-bundle\") pod 
\"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.811809 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.812287 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-logs\") pod \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.812362 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd6fz\" (UniqueName: \"kubernetes.io/projected/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-kube-api-access-nd6fz\") pod \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.812396 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-config-data\") pod \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\" (UID: \"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0\") " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.813457 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-logs" (OuterVolumeSpecName: "logs") pod "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" (UID: "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.815842 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" (UID: "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.817744 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-scripts" (OuterVolumeSpecName: "scripts") pod "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" (UID: "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.817886 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-kube-api-access-nd6fz" (OuterVolumeSpecName: "kube-api-access-nd6fz") pod "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" (UID: "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0"). InnerVolumeSpecName "kube-api-access-nd6fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.826660 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" (UID: "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.831313 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-acf1-account-create-bq4hx"] Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.845882 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" (UID: "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.862244 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-config-data" (OuterVolumeSpecName: "config-data") pod "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" (UID: "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.879378 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" (UID: "9f82abbf-1e09-4f46-97c6-6076a5c3a7e0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914235 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7m4s\" (UniqueName: \"kubernetes.io/projected/4be78d88-d62c-4ac0-a07c-0e3e323232c0-kube-api-access-f7m4s\") pod \"nova-api-acf1-account-create-bq4hx\" (UID: \"4be78d88-d62c-4ac0-a07c-0e3e323232c0\") " pod="openstack/nova-api-acf1-account-create-bq4hx" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914358 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914371 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914380 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914389 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914407 4747 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914417 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914424 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd6fz\" (UniqueName: \"kubernetes.io/projected/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-kube-api-access-nd6fz\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.914433 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.930343 4747 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.962458 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5920-account-create-8htm8"] Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.964542 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5920-account-create-8htm8" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.966445 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 19:05:42 crc kubenswrapper[4747]: I0930 19:05:42.977967 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5920-account-create-8htm8"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.015840 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7m4s\" (UniqueName: \"kubernetes.io/projected/4be78d88-d62c-4ac0-a07c-0e3e323232c0-kube-api-access-f7m4s\") pod \"nova-api-acf1-account-create-bq4hx\" (UID: \"4be78d88-d62c-4ac0-a07c-0e3e323232c0\") " pod="openstack/nova-api-acf1-account-create-bq4hx" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.015962 4747 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.035835 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7m4s\" (UniqueName: \"kubernetes.io/projected/4be78d88-d62c-4ac0-a07c-0e3e323232c0-kube-api-access-f7m4s\") pod \"nova-api-acf1-account-create-bq4hx\" (UID: \"4be78d88-d62c-4ac0-a07c-0e3e323232c0\") " pod="openstack/nova-api-acf1-account-create-bq4hx" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.095520 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688a1ce6-0861-4c45-9e0c-a94c65a5c5d4" path="/var/lib/kubelet/pods/688a1ce6-0861-4c45-9e0c-a94c65a5c5d4/volumes" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.114690 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-acf1-account-create-bq4hx" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.117120 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46lf8\" (UniqueName: \"kubernetes.io/projected/69f7e6bc-f06d-475a-8cb9-778ad0105c07-kube-api-access-46lf8\") pod \"nova-cell0-5920-account-create-8htm8\" (UID: \"69f7e6bc-f06d-475a-8cb9-778ad0105c07\") " pod="openstack/nova-cell0-5920-account-create-8htm8" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.128006 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b850-account-create-r6fql"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.129064 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b850-account-create-r6fql" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.132768 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.148158 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b850-account-create-r6fql"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.219474 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46lf8\" (UniqueName: \"kubernetes.io/projected/69f7e6bc-f06d-475a-8cb9-778ad0105c07-kube-api-access-46lf8\") pod \"nova-cell0-5920-account-create-8htm8\" (UID: \"69f7e6bc-f06d-475a-8cb9-778ad0105c07\") " pod="openstack/nova-cell0-5920-account-create-8htm8" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.245517 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46lf8\" (UniqueName: \"kubernetes.io/projected/69f7e6bc-f06d-475a-8cb9-778ad0105c07-kube-api-access-46lf8\") pod \"nova-cell0-5920-account-create-8htm8\" (UID: \"69f7e6bc-f06d-475a-8cb9-778ad0105c07\") " 
pod="openstack/nova-cell0-5920-account-create-8htm8" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.281609 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5920-account-create-8htm8" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.320914 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw5sn\" (UniqueName: \"kubernetes.io/projected/b8273640-d8ff-4820-8762-fe35091c22ff-kube-api-access-xw5sn\") pod \"nova-cell1-b850-account-create-r6fql\" (UID: \"b8273640-d8ff-4820-8762-fe35091c22ff\") " pod="openstack/nova-cell1-b850-account-create-r6fql" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.400639 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.407324 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f82abbf-1e09-4f46-97c6-6076a5c3a7e0","Type":"ContainerDied","Data":"438f36f95159ee84847dafd88eff358994ae9c882dbb53014522558179907668"} Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.407365 4747 scope.go:117] "RemoveContainer" containerID="d596236814f9eaa207e959f914461e79e4719a27bdd84f9494f3fa02a3c9e822" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.407490 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.422503 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw5sn\" (UniqueName: \"kubernetes.io/projected/b8273640-d8ff-4820-8762-fe35091c22ff-kube-api-access-xw5sn\") pod \"nova-cell1-b850-account-create-r6fql\" (UID: \"b8273640-d8ff-4820-8762-fe35091c22ff\") " pod="openstack/nova-cell1-b850-account-create-r6fql" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.453806 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw5sn\" (UniqueName: \"kubernetes.io/projected/b8273640-d8ff-4820-8762-fe35091c22ff-kube-api-access-xw5sn\") pod \"nova-cell1-b850-account-create-r6fql\" (UID: \"b8273640-d8ff-4820-8762-fe35091c22ff\") " pod="openstack/nova-cell1-b850-account-create-r6fql" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.459635 4747 scope.go:117] "RemoveContainer" containerID="2ae2c51de35c0de7257e4e9a5f833a1878d488774559ad44b39c31c302e2937a" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.484501 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.496212 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5677cbcc7-67jtn" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.502147 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.519231 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.521437 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.525170 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.525415 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.527160 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:05:43 crc kubenswrapper[4747]: W0930 19:05:43.561527 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f7e6bc_f06d_475a_8cb9_778ad0105c07.slice/crio-6385f0ce04a7f0b7ee593063527654ee2cddfad434855d6b5d7954c2e7e51aea WatchSource:0}: Error finding container 6385f0ce04a7f0b7ee593063527654ee2cddfad434855d6b5d7954c2e7e51aea: Status 404 returned error can't find the container with id 6385f0ce04a7f0b7ee593063527654ee2cddfad434855d6b5d7954c2e7e51aea Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.573654 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db8dcfd56-rp5sl"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.573842 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-db8dcfd56-rp5sl" podUID="94976787-0fa3-4e10-9902-1918b9512f30" containerName="neutron-api" containerID="cri-o://5ae68ca2574ead6cd58662940d59695ac06f879053d563bebf9d30a87e675907" gracePeriod=30 Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.573942 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-db8dcfd56-rp5sl" podUID="94976787-0fa3-4e10-9902-1918b9512f30" containerName="neutron-httpd" containerID="cri-o://ff516d0918a894dbef5bd7e6c1cc4a8bdcfe9e38ed9a3aa7c4c55355b6dcdb0b" 
gracePeriod=30 Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.597000 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5920-account-create-8htm8"] Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.611196 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-acf1-account-create-bq4hx"] Sep 30 19:05:43 crc kubenswrapper[4747]: W0930 19:05:43.613084 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4be78d88_d62c_4ac0_a07c_0e3e323232c0.slice/crio-4e87a3f1229459ea084f2f6dda784b968abb86973dc43c8743e7d52a9c480bdd WatchSource:0}: Error finding container 4e87a3f1229459ea084f2f6dda784b968abb86973dc43c8743e7d52a9c480bdd: Status 404 returned error can't find the container with id 4e87a3f1229459ea084f2f6dda784b968abb86973dc43c8743e7d52a9c480bdd Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.625381 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.625413 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.625484 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c4lg\" (UniqueName: 
\"kubernetes.io/projected/10f77684-656c-4043-b9b4-8e6bfdca1621-kube-api-access-4c4lg\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.625509 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.625534 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10f77684-656c-4043-b9b4-8e6bfdca1621-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.625552 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.625600 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.625627 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/10f77684-656c-4043-b9b4-8e6bfdca1621-logs\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.730507 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.730548 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10f77684-656c-4043-b9b4-8e6bfdca1621-logs\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.730615 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.730634 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.730681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c4lg\" (UniqueName: 
\"kubernetes.io/projected/10f77684-656c-4043-b9b4-8e6bfdca1621-kube-api-access-4c4lg\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.730701 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.730724 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10f77684-656c-4043-b9b4-8e6bfdca1621-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.730740 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.732309 4747 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.732607 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/10f77684-656c-4043-b9b4-8e6bfdca1621-logs\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.733412 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10f77684-656c-4043-b9b4-8e6bfdca1621-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.737436 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.744096 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.745508 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.747028 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b850-account-create-r6fql" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.748080 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10f77684-656c-4043-b9b4-8e6bfdca1621-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.752464 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c4lg\" (UniqueName: \"kubernetes.io/projected/10f77684-656c-4043-b9b4-8e6bfdca1621-kube-api-access-4c4lg\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.799766 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"10f77684-656c-4043-b9b4-8e6bfdca1621\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:05:43 crc kubenswrapper[4747]: I0930 19:05:43.881088 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.225472 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b850-account-create-r6fql"] Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.418267 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.452236 4747 generic.go:334] "Generic (PLEG): container finished" podID="4be78d88-d62c-4ac0-a07c-0e3e323232c0" containerID="41cbfdef25dcd41346009cf5b18826b89241b94a58e75eefa0cb73a22366c021" exitCode=0 Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.452379 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-acf1-account-create-bq4hx" event={"ID":"4be78d88-d62c-4ac0-a07c-0e3e323232c0","Type":"ContainerDied","Data":"41cbfdef25dcd41346009cf5b18826b89241b94a58e75eefa0cb73a22366c021"} Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.452405 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-acf1-account-create-bq4hx" event={"ID":"4be78d88-d62c-4ac0-a07c-0e3e323232c0","Type":"ContainerStarted","Data":"4e87a3f1229459ea084f2f6dda784b968abb86973dc43c8743e7d52a9c480bdd"} Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.459868 4747 generic.go:334] "Generic (PLEG): container finished" podID="94976787-0fa3-4e10-9902-1918b9512f30" containerID="ff516d0918a894dbef5bd7e6c1cc4a8bdcfe9e38ed9a3aa7c4c55355b6dcdb0b" exitCode=0 Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.459961 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db8dcfd56-rp5sl" event={"ID":"94976787-0fa3-4e10-9902-1918b9512f30","Type":"ContainerDied","Data":"ff516d0918a894dbef5bd7e6c1cc4a8bdcfe9e38ed9a3aa7c4c55355b6dcdb0b"} Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.468549 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"a6393639-cf22-49a1-96e1-f20d11a72791","Type":"ContainerStarted","Data":"5711ca44f5ec670287423cb3c1170ebd45005145128997819a2b53777d391582"} Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.468610 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6393639-cf22-49a1-96e1-f20d11a72791","Type":"ContainerStarted","Data":"5303d6b2a933936c6d25d384bc8d367e152577b2b791b7ee47e598e7089dc0fa"} Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.474439 4747 generic.go:334] "Generic (PLEG): container finished" podID="69f7e6bc-f06d-475a-8cb9-778ad0105c07" containerID="8782fd1224eaa557958a4b9695e34272e34c2388c1c5158e5889508200fcab14" exitCode=0 Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.474484 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5920-account-create-8htm8" event={"ID":"69f7e6bc-f06d-475a-8cb9-778ad0105c07","Type":"ContainerDied","Data":"8782fd1224eaa557958a4b9695e34272e34c2388c1c5158e5889508200fcab14"} Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.474499 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5920-account-create-8htm8" event={"ID":"69f7e6bc-f06d-475a-8cb9-778ad0105c07","Type":"ContainerStarted","Data":"6385f0ce04a7f0b7ee593063527654ee2cddfad434855d6b5d7954c2e7e51aea"} Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.476916 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b850-account-create-r6fql" event={"ID":"b8273640-d8ff-4820-8762-fe35091c22ff","Type":"ContainerStarted","Data":"475ec8df0e28d8832f0ba3914da65696502b78ac8aec0b802ac63ead83118026"} Sep 30 19:05:44 crc kubenswrapper[4747]: I0930 19:05:44.510325 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-b850-account-create-r6fql" podStartSLOduration=1.510309302 podStartE2EDuration="1.510309302s" 
podCreationTimestamp="2025-09-30 19:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:44.50466419 +0000 UTC m=+1184.164144294" watchObservedRunningTime="2025-09-30 19:05:44.510309302 +0000 UTC m=+1184.169789416" Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.102705 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f82abbf-1e09-4f46-97c6-6076a5c3a7e0" path="/var/lib/kubelet/pods/9f82abbf-1e09-4f46-97c6-6076a5c3a7e0/volumes" Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.489382 4747 generic.go:334] "Generic (PLEG): container finished" podID="b8273640-d8ff-4820-8762-fe35091c22ff" containerID="665e4d8c3052a81434fd0b951eec2e45c3587a176b6815a8ea0e7af1db7fca63" exitCode=0 Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.489481 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b850-account-create-r6fql" event={"ID":"b8273640-d8ff-4820-8762-fe35091c22ff","Type":"ContainerDied","Data":"665e4d8c3052a81434fd0b951eec2e45c3587a176b6815a8ea0e7af1db7fca63"} Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.491666 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10f77684-656c-4043-b9b4-8e6bfdca1621","Type":"ContainerStarted","Data":"ee8c749d6de49209c0cd5bdc60eb4990c966dc2170601c8bdeefb7891d51e00c"} Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.491703 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10f77684-656c-4043-b9b4-8e6bfdca1621","Type":"ContainerStarted","Data":"2b51b1f44459bc3174b43b59a60c47118f346b0cdaeffaa3f79585e5100a0bd4"} Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.504393 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a6393639-cf22-49a1-96e1-f20d11a72791","Type":"ContainerStarted","Data":"aa1b169a2097816271245c27c1688c822eb2496ab72f59bef95192d0bd76be30"} Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.546689 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.546665886 podStartE2EDuration="3.546665886s" podCreationTimestamp="2025-09-30 19:05:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:45.535285309 +0000 UTC m=+1185.194765433" watchObservedRunningTime="2025-09-30 19:05:45.546665886 +0000 UTC m=+1185.206146000" Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.891457 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-acf1-account-create-bq4hx" Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.969329 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7m4s\" (UniqueName: \"kubernetes.io/projected/4be78d88-d62c-4ac0-a07c-0e3e323232c0-kube-api-access-f7m4s\") pod \"4be78d88-d62c-4ac0-a07c-0e3e323232c0\" (UID: \"4be78d88-d62c-4ac0-a07c-0e3e323232c0\") " Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.975123 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be78d88-d62c-4ac0-a07c-0e3e323232c0-kube-api-access-f7m4s" (OuterVolumeSpecName: "kube-api-access-f7m4s") pod "4be78d88-d62c-4ac0-a07c-0e3e323232c0" (UID: "4be78d88-d62c-4ac0-a07c-0e3e323232c0"). InnerVolumeSpecName "kube-api-access-f7m4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:45 crc kubenswrapper[4747]: I0930 19:05:45.980144 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5920-account-create-8htm8" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.071629 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46lf8\" (UniqueName: \"kubernetes.io/projected/69f7e6bc-f06d-475a-8cb9-778ad0105c07-kube-api-access-46lf8\") pod \"69f7e6bc-f06d-475a-8cb9-778ad0105c07\" (UID: \"69f7e6bc-f06d-475a-8cb9-778ad0105c07\") " Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.072368 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7m4s\" (UniqueName: \"kubernetes.io/projected/4be78d88-d62c-4ac0-a07c-0e3e323232c0-kube-api-access-f7m4s\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.074564 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f7e6bc-f06d-475a-8cb9-778ad0105c07-kube-api-access-46lf8" (OuterVolumeSpecName: "kube-api-access-46lf8") pod "69f7e6bc-f06d-475a-8cb9-778ad0105c07" (UID: "69f7e6bc-f06d-475a-8cb9-778ad0105c07"). InnerVolumeSpecName "kube-api-access-46lf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.174162 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46lf8\" (UniqueName: \"kubernetes.io/projected/69f7e6bc-f06d-475a-8cb9-778ad0105c07-kube-api-access-46lf8\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.514289 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10f77684-656c-4043-b9b4-8e6bfdca1621","Type":"ContainerStarted","Data":"e2af8739f1213ae8850f854d3954a116b426ce288c7ecbd5a7d91e2ad26fde90"} Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.515891 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-acf1-account-create-bq4hx" event={"ID":"4be78d88-d62c-4ac0-a07c-0e3e323232c0","Type":"ContainerDied","Data":"4e87a3f1229459ea084f2f6dda784b968abb86973dc43c8743e7d52a9c480bdd"} Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.515949 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e87a3f1229459ea084f2f6dda784b968abb86973dc43c8743e7d52a9c480bdd" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.515958 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-acf1-account-create-bq4hx" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.517560 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5920-account-create-8htm8" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.517511 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5920-account-create-8htm8" event={"ID":"69f7e6bc-f06d-475a-8cb9-778ad0105c07","Type":"ContainerDied","Data":"6385f0ce04a7f0b7ee593063527654ee2cddfad434855d6b5d7954c2e7e51aea"} Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.528602 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6385f0ce04a7f0b7ee593063527654ee2cddfad434855d6b5d7954c2e7e51aea" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.559671 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.559651407 podStartE2EDuration="3.559651407s" podCreationTimestamp="2025-09-30 19:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:05:46.547786686 +0000 UTC m=+1186.207266800" watchObservedRunningTime="2025-09-30 19:05:46.559651407 +0000 UTC m=+1186.219131521" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.843538 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b850-account-create-r6fql" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.886079 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw5sn\" (UniqueName: \"kubernetes.io/projected/b8273640-d8ff-4820-8762-fe35091c22ff-kube-api-access-xw5sn\") pod \"b8273640-d8ff-4820-8762-fe35091c22ff\" (UID: \"b8273640-d8ff-4820-8762-fe35091c22ff\") " Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.892948 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8273640-d8ff-4820-8762-fe35091c22ff-kube-api-access-xw5sn" (OuterVolumeSpecName: "kube-api-access-xw5sn") pod "b8273640-d8ff-4820-8762-fe35091c22ff" (UID: "b8273640-d8ff-4820-8762-fe35091c22ff"). InnerVolumeSpecName "kube-api-access-xw5sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:46 crc kubenswrapper[4747]: I0930 19:05:46.988235 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw5sn\" (UniqueName: \"kubernetes.io/projected/b8273640-d8ff-4820-8762-fe35091c22ff-kube-api-access-xw5sn\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:47 crc kubenswrapper[4747]: I0930 19:05:47.530615 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b850-account-create-r6fql" event={"ID":"b8273640-d8ff-4820-8762-fe35091c22ff","Type":"ContainerDied","Data":"475ec8df0e28d8832f0ba3914da65696502b78ac8aec0b802ac63ead83118026"} Sep 30 19:05:47 crc kubenswrapper[4747]: I0930 19:05:47.530679 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475ec8df0e28d8832f0ba3914da65696502b78ac8aec0b802ac63ead83118026" Sep 30 19:05:47 crc kubenswrapper[4747]: I0930 19:05:47.530644 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b850-account-create-r6fql" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.248950 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ppfv"] Sep 30 19:05:48 crc kubenswrapper[4747]: E0930 19:05:48.249362 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8273640-d8ff-4820-8762-fe35091c22ff" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.249377 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8273640-d8ff-4820-8762-fe35091c22ff" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: E0930 19:05:48.249400 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f7e6bc-f06d-475a-8cb9-778ad0105c07" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.249412 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f7e6bc-f06d-475a-8cb9-778ad0105c07" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: E0930 19:05:48.249439 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be78d88-d62c-4ac0-a07c-0e3e323232c0" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.249449 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be78d88-d62c-4ac0-a07c-0e3e323232c0" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.249664 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be78d88-d62c-4ac0-a07c-0e3e323232c0" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.249684 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8273640-d8ff-4820-8762-fe35091c22ff" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.249708 4747 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="69f7e6bc-f06d-475a-8cb9-778ad0105c07" containerName="mariadb-account-create" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.250381 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.253511 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.254357 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.257266 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gz2rz" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.271326 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ppfv"] Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.348694 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxs5h\" (UniqueName: \"kubernetes.io/projected/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-kube-api-access-fxs5h\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.348751 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-scripts\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.348774 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-config-data\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.348853 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.450822 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.451308 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxs5h\" (UniqueName: \"kubernetes.io/projected/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-kube-api-access-fxs5h\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.451357 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-scripts\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.451383 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-config-data\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.454583 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-scripts\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.458264 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-config-data\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.461262 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.472529 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxs5h\" (UniqueName: \"kubernetes.io/projected/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-kube-api-access-fxs5h\") pod \"nova-cell0-conductor-db-sync-7ppfv\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.549260 4747 generic.go:334] "Generic (PLEG): container finished" 
podID="94976787-0fa3-4e10-9902-1918b9512f30" containerID="5ae68ca2574ead6cd58662940d59695ac06f879053d563bebf9d30a87e675907" exitCode=0 Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.549324 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db8dcfd56-rp5sl" event={"ID":"94976787-0fa3-4e10-9902-1918b9512f30","Type":"ContainerDied","Data":"5ae68ca2574ead6cd58662940d59695ac06f879053d563bebf9d30a87e675907"} Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.568987 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.815731 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.820256 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ppfv"] Sep 30 19:05:48 crc kubenswrapper[4747]: W0930 19:05:48.824371 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39451899_d6f8_4b5a_aa40_5e72ee92d8d7.slice/crio-9800f79935b18185816052e79f5a70e333e9a805777313be5ec6d02e4218bbbc WatchSource:0}: Error finding container 9800f79935b18185816052e79f5a70e333e9a805777313be5ec6d02e4218bbbc: Status 404 returned error can't find the container with id 9800f79935b18185816052e79f5a70e333e9a805777313be5ec6d02e4218bbbc Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.961603 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-httpd-config\") pod \"94976787-0fa3-4e10-9902-1918b9512f30\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.961768 4747 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kz6c9\" (UniqueName: \"kubernetes.io/projected/94976787-0fa3-4e10-9902-1918b9512f30-kube-api-access-kz6c9\") pod \"94976787-0fa3-4e10-9902-1918b9512f30\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.962061 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-ovndb-tls-certs\") pod \"94976787-0fa3-4e10-9902-1918b9512f30\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.962110 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-config\") pod \"94976787-0fa3-4e10-9902-1918b9512f30\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.962157 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-combined-ca-bundle\") pod \"94976787-0fa3-4e10-9902-1918b9512f30\" (UID: \"94976787-0fa3-4e10-9902-1918b9512f30\") " Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.968885 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "94976787-0fa3-4e10-9902-1918b9512f30" (UID: "94976787-0fa3-4e10-9902-1918b9512f30"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:48 crc kubenswrapper[4747]: I0930 19:05:48.969391 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94976787-0fa3-4e10-9902-1918b9512f30-kube-api-access-kz6c9" (OuterVolumeSpecName: "kube-api-access-kz6c9") pod "94976787-0fa3-4e10-9902-1918b9512f30" (UID: "94976787-0fa3-4e10-9902-1918b9512f30"). InnerVolumeSpecName "kube-api-access-kz6c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.029975 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-config" (OuterVolumeSpecName: "config") pod "94976787-0fa3-4e10-9902-1918b9512f30" (UID: "94976787-0fa3-4e10-9902-1918b9512f30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.038389 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "94976787-0fa3-4e10-9902-1918b9512f30" (UID: "94976787-0fa3-4e10-9902-1918b9512f30"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.052097 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94976787-0fa3-4e10-9902-1918b9512f30" (UID: "94976787-0fa3-4e10-9902-1918b9512f30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.065286 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz6c9\" (UniqueName: \"kubernetes.io/projected/94976787-0fa3-4e10-9902-1918b9512f30-kube-api-access-kz6c9\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.065793 4747 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.065827 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.065858 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.065887 4747 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94976787-0fa3-4e10-9902-1918b9512f30-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.568596 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db8dcfd56-rp5sl" event={"ID":"94976787-0fa3-4e10-9902-1918b9512f30","Type":"ContainerDied","Data":"5f27af16864ce7f68130408cfc1325e974b2fd43c4b0bf46724d4dc4f59be3e4"} Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.568651 4747 scope.go:117] "RemoveContainer" containerID="ff516d0918a894dbef5bd7e6c1cc4a8bdcfe9e38ed9a3aa7c4c55355b6dcdb0b" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.568650 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db8dcfd56-rp5sl" Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.571205 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" event={"ID":"39451899-d6f8-4b5a-aa40-5e72ee92d8d7","Type":"ContainerStarted","Data":"9800f79935b18185816052e79f5a70e333e9a805777313be5ec6d02e4218bbbc"} Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.596808 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db8dcfd56-rp5sl"] Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.602852 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db8dcfd56-rp5sl"] Sep 30 19:05:49 crc kubenswrapper[4747]: I0930 19:05:49.609727 4747 scope.go:117] "RemoveContainer" containerID="5ae68ca2574ead6cd58662940d59695ac06f879053d563bebf9d30a87e675907" Sep 30 19:05:51 crc kubenswrapper[4747]: I0930 19:05:51.118267 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94976787-0fa3-4e10-9902-1918b9512f30" path="/var/lib/kubelet/pods/94976787-0fa3-4e10-9902-1918b9512f30/volumes" Sep 30 19:05:52 crc kubenswrapper[4747]: I0930 19:05:52.807272 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 19:05:52 crc kubenswrapper[4747]: I0930 19:05:52.807549 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 19:05:52 crc kubenswrapper[4747]: I0930 19:05:52.845211 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 19:05:52 crc kubenswrapper[4747]: I0930 19:05:52.862793 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 19:05:53 crc kubenswrapper[4747]: I0930 19:05:53.610165 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-external-api-0" Sep 30 19:05:53 crc kubenswrapper[4747]: I0930 19:05:53.610565 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 19:05:53 crc kubenswrapper[4747]: I0930 19:05:53.881416 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:53 crc kubenswrapper[4747]: I0930 19:05:53.881490 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:53 crc kubenswrapper[4747]: I0930 19:05:53.921056 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:53 crc kubenswrapper[4747]: I0930 19:05:53.935060 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:54 crc kubenswrapper[4747]: I0930 19:05:54.620469 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:54 crc kubenswrapper[4747]: I0930 19:05:54.620551 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:55 crc kubenswrapper[4747]: I0930 19:05:55.447906 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 19:05:55 crc kubenswrapper[4747]: I0930 19:05:55.461967 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 19:05:55 crc kubenswrapper[4747]: I0930 19:05:55.629828 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" 
event={"ID":"39451899-d6f8-4b5a-aa40-5e72ee92d8d7","Type":"ContainerStarted","Data":"d6c7ecf929760a6ad906bde3a6b844b08e02434bb6f044d8e4abfe175a0d4991"} Sep 30 19:05:55 crc kubenswrapper[4747]: I0930 19:05:55.652759 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" podStartSLOduration=1.382186069 podStartE2EDuration="7.65273883s" podCreationTimestamp="2025-09-30 19:05:48 +0000 UTC" firstStartedPulling="2025-09-30 19:05:48.827028821 +0000 UTC m=+1188.486508935" lastFinishedPulling="2025-09-30 19:05:55.097581582 +0000 UTC m=+1194.757061696" observedRunningTime="2025-09-30 19:05:55.645186493 +0000 UTC m=+1195.304666617" watchObservedRunningTime="2025-09-30 19:05:55.65273883 +0000 UTC m=+1195.312218944" Sep 30 19:05:56 crc kubenswrapper[4747]: I0930 19:05:56.391716 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 19:05:56 crc kubenswrapper[4747]: I0930 19:05:56.435100 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 19:06:04 crc kubenswrapper[4747]: I0930 19:06:04.740599 4747 generic.go:334] "Generic (PLEG): container finished" podID="39451899-d6f8-4b5a-aa40-5e72ee92d8d7" containerID="d6c7ecf929760a6ad906bde3a6b844b08e02434bb6f044d8e4abfe175a0d4991" exitCode=0 Sep 30 19:06:04 crc kubenswrapper[4747]: I0930 19:06:04.740848 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" event={"ID":"39451899-d6f8-4b5a-aa40-5e72ee92d8d7","Type":"ContainerDied","Data":"d6c7ecf929760a6ad906bde3a6b844b08e02434bb6f044d8e4abfe175a0d4991"} Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.198272 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.316469 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-combined-ca-bundle\") pod \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.316535 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-scripts\") pod \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.316593 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxs5h\" (UniqueName: \"kubernetes.io/projected/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-kube-api-access-fxs5h\") pod \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.316742 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-config-data\") pod \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\" (UID: \"39451899-d6f8-4b5a-aa40-5e72ee92d8d7\") " Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.322578 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-scripts" (OuterVolumeSpecName: "scripts") pod "39451899-d6f8-4b5a-aa40-5e72ee92d8d7" (UID: "39451899-d6f8-4b5a-aa40-5e72ee92d8d7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.323169 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-kube-api-access-fxs5h" (OuterVolumeSpecName: "kube-api-access-fxs5h") pod "39451899-d6f8-4b5a-aa40-5e72ee92d8d7" (UID: "39451899-d6f8-4b5a-aa40-5e72ee92d8d7"). InnerVolumeSpecName "kube-api-access-fxs5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.356786 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-config-data" (OuterVolumeSpecName: "config-data") pod "39451899-d6f8-4b5a-aa40-5e72ee92d8d7" (UID: "39451899-d6f8-4b5a-aa40-5e72ee92d8d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.359037 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39451899-d6f8-4b5a-aa40-5e72ee92d8d7" (UID: "39451899-d6f8-4b5a-aa40-5e72ee92d8d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.419856 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.419901 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.419939 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxs5h\" (UniqueName: \"kubernetes.io/projected/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-kube-api-access-fxs5h\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.419960 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39451899-d6f8-4b5a-aa40-5e72ee92d8d7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.764382 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" event={"ID":"39451899-d6f8-4b5a-aa40-5e72ee92d8d7","Type":"ContainerDied","Data":"9800f79935b18185816052e79f5a70e333e9a805777313be5ec6d02e4218bbbc"} Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.764855 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9800f79935b18185816052e79f5a70e333e9a805777313be5ec6d02e4218bbbc" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.764507 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ppfv" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.898591 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 19:06:06 crc kubenswrapper[4747]: E0930 19:06:06.899018 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94976787-0fa3-4e10-9902-1918b9512f30" containerName="neutron-api" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.899036 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="94976787-0fa3-4e10-9902-1918b9512f30" containerName="neutron-api" Sep 30 19:06:06 crc kubenswrapper[4747]: E0930 19:06:06.899063 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94976787-0fa3-4e10-9902-1918b9512f30" containerName="neutron-httpd" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.899071 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="94976787-0fa3-4e10-9902-1918b9512f30" containerName="neutron-httpd" Sep 30 19:06:06 crc kubenswrapper[4747]: E0930 19:06:06.899085 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39451899-d6f8-4b5a-aa40-5e72ee92d8d7" containerName="nova-cell0-conductor-db-sync" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.899094 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="39451899-d6f8-4b5a-aa40-5e72ee92d8d7" containerName="nova-cell0-conductor-db-sync" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.899294 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="94976787-0fa3-4e10-9902-1918b9512f30" containerName="neutron-api" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.899312 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="39451899-d6f8-4b5a-aa40-5e72ee92d8d7" containerName="nova-cell0-conductor-db-sync" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.899336 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="94976787-0fa3-4e10-9902-1918b9512f30" containerName="neutron-httpd" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.900096 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.906179 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.906372 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gz2rz" Sep 30 19:06:06 crc kubenswrapper[4747]: I0930 19:06:06.915130 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.032900 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdq6c\" (UniqueName: \"kubernetes.io/projected/26b23eeb-041a-473c-8748-d54b077c81f3-kube-api-access-hdq6c\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.033162 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b23eeb-041a-473c-8748-d54b077c81f3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.033287 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b23eeb-041a-473c-8748-d54b077c81f3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: 
I0930 19:06:07.134429 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdq6c\" (UniqueName: \"kubernetes.io/projected/26b23eeb-041a-473c-8748-d54b077c81f3-kube-api-access-hdq6c\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.139295 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b23eeb-041a-473c-8748-d54b077c81f3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.139392 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b23eeb-041a-473c-8748-d54b077c81f3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.142808 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26b23eeb-041a-473c-8748-d54b077c81f3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.143051 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26b23eeb-041a-473c-8748-d54b077c81f3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.156260 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdq6c\" 
(UniqueName: \"kubernetes.io/projected/26b23eeb-041a-473c-8748-d54b077c81f3-kube-api-access-hdq6c\") pod \"nova-cell0-conductor-0\" (UID: \"26b23eeb-041a-473c-8748-d54b077c81f3\") " pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.220033 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.518158 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 19:06:07 crc kubenswrapper[4747]: W0930 19:06:07.520854 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26b23eeb_041a_473c_8748_d54b077c81f3.slice/crio-5a881d8f5a82149ad088621b48d2f3872ce882ca30196172f07e9e61479b1011 WatchSource:0}: Error finding container 5a881d8f5a82149ad088621b48d2f3872ce882ca30196172f07e9e61479b1011: Status 404 returned error can't find the container with id 5a881d8f5a82149ad088621b48d2f3872ce882ca30196172f07e9e61479b1011 Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.780509 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"26b23eeb-041a-473c-8748-d54b077c81f3","Type":"ContainerStarted","Data":"c3bbab69cd9c89b7fa8c8361404a7059ba11af3a0e0e7a4d7d0460d3668dcfcb"} Sep 30 19:06:07 crc kubenswrapper[4747]: I0930 19:06:07.780576 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"26b23eeb-041a-473c-8748-d54b077c81f3","Type":"ContainerStarted","Data":"5a881d8f5a82149ad088621b48d2f3872ce882ca30196172f07e9e61479b1011"} Sep 30 19:06:08 crc kubenswrapper[4747]: I0930 19:06:08.791505 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:08 crc kubenswrapper[4747]: I0930 19:06:08.826001 4747 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.82597223 podStartE2EDuration="2.82597223s" podCreationTimestamp="2025-09-30 19:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:08.812785222 +0000 UTC m=+1208.472265366" watchObservedRunningTime="2025-09-30 19:06:08.82597223 +0000 UTC m=+1208.485452384" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.266362 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.835626 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vwjq7"] Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.836961 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.839636 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.839706 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.856484 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vwjq7"] Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.958462 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-scripts\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.958506 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24bh6\" (UniqueName: \"kubernetes.io/projected/3aabdf08-0969-4173-ba76-9e55ba35150a-kube-api-access-24bh6\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.958629 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.959038 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-config-data\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:12 crc kubenswrapper[4747]: I0930 19:06:12.980561 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.007148 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.009828 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.053772 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.063886 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.063953 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-config-data\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.064000 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-logs\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.064044 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx7r2\" (UniqueName: \"kubernetes.io/projected/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-kube-api-access-rx7r2\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.064077 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-config-data\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.064122 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-scripts\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.064138 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.064158 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24bh6\" (UniqueName: \"kubernetes.io/projected/3aabdf08-0969-4173-ba76-9e55ba35150a-kube-api-access-24bh6\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.074662 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-config-data\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.074856 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-scripts\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.084644 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.087791 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24bh6\" (UniqueName: \"kubernetes.io/projected/3aabdf08-0969-4173-ba76-9e55ba35150a-kube-api-access-24bh6\") pod \"nova-cell0-cell-mapping-vwjq7\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") " pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.107696 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.109599 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.113170 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.119468 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.158809 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vwjq7" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.165628 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-config-data\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.165722 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-config-data\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.165770 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-logs\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.165862 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx7r2\" (UniqueName: \"kubernetes.io/projected/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-kube-api-access-rx7r2\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.165918 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.166060 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.166088 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkxnp\" (UniqueName: \"kubernetes.io/projected/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-kube-api-access-vkxnp\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.168889 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-logs\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.183553 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.190204 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-config-data\") pod \"nova-api-0\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.191794 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx7r2\" (UniqueName: \"kubernetes.io/projected/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-kube-api-access-rx7r2\") pod \"nova-api-0\" (UID: 
\"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.199530 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.221963 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.236653 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.242436 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.252985 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.254232 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.261314 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.266052 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.267666 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4lhw\" (UniqueName: \"kubernetes.io/projected/16b7fdc7-0f23-4186-ba14-e167730760b2-kube-api-access-l4lhw\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.267726 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkxnp\" (UniqueName: 
\"kubernetes.io/projected/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-kube-api-access-vkxnp\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.267795 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-config-data\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.267827 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-config-data\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.267896 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.267955 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16b7fdc7-0f23-4186-ba14-e167730760b2-logs\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.267975 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.287780 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.290645 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-config-data\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.308866 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkxnp\" (UniqueName: \"kubernetes.io/projected/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-kube-api-access-vkxnp\") pod \"nova-scheduler-0\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.352844 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cbc9bd96f-m9fjq"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.354627 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.356675 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.363093 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbc9bd96f-m9fjq"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.372749 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.375796 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46tm\" (UniqueName: \"kubernetes.io/projected/7322c2b5-665e-4f0f-9680-002ba350622c-kube-api-access-j46tm\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.377036 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgzq6\" (UniqueName: \"kubernetes.io/projected/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-kube-api-access-bgzq6\") pod \"nova-cell1-novncproxy-0\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.377106 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-config\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.377129 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.377184 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16b7fdc7-0f23-4186-ba14-e167730760b2-logs\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.377204 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.377275 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.381771 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4lhw\" (UniqueName: \"kubernetes.io/projected/16b7fdc7-0f23-4186-ba14-e167730760b2-kube-api-access-l4lhw\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.381915 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-dns-svc\") 
pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.382112 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.383768 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-config-data\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.385630 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.389181 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-config-data\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.391730 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16b7fdc7-0f23-4186-ba14-e167730760b2-logs\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.407132 
4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4lhw\" (UniqueName: \"kubernetes.io/projected/16b7fdc7-0f23-4186-ba14-e167730760b2-kube-api-access-l4lhw\") pod \"nova-metadata-0\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") " pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.474131 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.492814 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.492916 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.492972 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46tm\" (UniqueName: \"kubernetes.io/projected/7322c2b5-665e-4f0f-9680-002ba350622c-kube-api-access-j46tm\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.493055 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgzq6\" (UniqueName: \"kubernetes.io/projected/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-kube-api-access-bgzq6\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.493088 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-config\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.493106 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.493181 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.493232 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-dns-svc\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.493716 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc 
kubenswrapper[4747]: I0930 19:06:13.493977 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-dns-svc\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.494024 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.494272 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-config\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.497175 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.502150 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.510211 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgzq6\" 
(UniqueName: \"kubernetes.io/projected/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-kube-api-access-bgzq6\") pod \"nova-cell1-novncproxy-0\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.511183 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46tm\" (UniqueName: \"kubernetes.io/projected/7322c2b5-665e-4f0f-9680-002ba350622c-kube-api-access-j46tm\") pod \"dnsmasq-dns-5cbc9bd96f-m9fjq\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.628826 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.638523 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.676446 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.771353 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.807816 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vwjq7"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.872422 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3","Type":"ContainerStarted","Data":"d48f2d9e5325413497f58077afc94f0c06997f001737ffeda048fd71e9a42201"} Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.873303 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vwjq7" event={"ID":"3aabdf08-0969-4173-ba76-9e55ba35150a","Type":"ContainerStarted","Data":"144234dcd210363c2306a98603cf37a77d02dbbec84bf22ff86f5589fc710c3c"} Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.882120 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.955689 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-krqdj"] Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.956769 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.964200 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.964367 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 19:06:13 crc kubenswrapper[4747]: I0930 19:06:13.967238 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-krqdj"] Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.002890 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-config-data\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.002962 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.002981 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-scripts\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.003029 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m2sc7\" (UniqueName: \"kubernetes.io/projected/85ca5396-cd5d-4c87-930f-0891e66a5613-kube-api-access-m2sc7\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.104810 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-config-data\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.105125 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.105141 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-scripts\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.105185 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2sc7\" (UniqueName: \"kubernetes.io/projected/85ca5396-cd5d-4c87-930f-0891e66a5613-kube-api-access-m2sc7\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.109618 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-config-data\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.111505 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.114803 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-scripts\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.131774 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2sc7\" (UniqueName: \"kubernetes.io/projected/85ca5396-cd5d-4c87-930f-0891e66a5613-kube-api-access-m2sc7\") pod \"nova-cell1-conductor-db-sync-krqdj\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: W0930 19:06:14.231386 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19e0ae2f_1fde_42db_8386_4ebbe8dcd883.slice/crio-4f943d465213c9ada3732a358593b465881a8c455bd3148ca8403cee557944f7 WatchSource:0}: Error finding container 4f943d465213c9ada3732a358593b465881a8c455bd3148ca8403cee557944f7: Status 404 returned error can't find the container with id 4f943d465213c9ada3732a358593b465881a8c455bd3148ca8403cee557944f7 Sep 30 19:06:14 crc 
kubenswrapper[4747]: I0930 19:06:14.239243 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.252034 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.286136 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbc9bd96f-m9fjq"] Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.353564 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.908788 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-krqdj"] Sep 30 19:06:14 crc kubenswrapper[4747]: W0930 19:06:14.926085 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ca5396_cd5d_4c87_930f_0891e66a5613.slice/crio-5490b193c883747c70271e13f6282a66d3727ad3c5d0c101de7a3abf8839ae42 WatchSource:0}: Error finding container 5490b193c883747c70271e13f6282a66d3727ad3c5d0c101de7a3abf8839ae42: Status 404 returned error can't find the container with id 5490b193c883747c70271e13f6282a66d3727ad3c5d0c101de7a3abf8839ae42 Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.949293 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vwjq7" event={"ID":"3aabdf08-0969-4173-ba76-9e55ba35150a","Type":"ContainerStarted","Data":"15f0a5e64f9e3c353b72d8e5d70bcf93ca02dc298bb54cf3639c6a8d482bcc0b"} Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.956580 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3","Type":"ContainerStarted","Data":"89b7fa9ea706527b8edd4650b6c832a1d385c334386360180854ab0511d59d82"} Sep 30 19:06:14 crc 
kubenswrapper[4747]: I0930 19:06:14.957498 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"19e0ae2f-1fde-42db-8386-4ebbe8dcd883","Type":"ContainerStarted","Data":"4f943d465213c9ada3732a358593b465881a8c455bd3148ca8403cee557944f7"}
Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.958539 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16b7fdc7-0f23-4186-ba14-e167730760b2","Type":"ContainerStarted","Data":"2058838a159e7b034f248a40be9bb7042b5c14af7dddc6e97228c1686e1a4182"}
Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.959805 4747 generic.go:334] "Generic (PLEG): container finished" podID="7322c2b5-665e-4f0f-9680-002ba350622c" containerID="37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0" exitCode=0
Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.961011 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" event={"ID":"7322c2b5-665e-4f0f-9680-002ba350622c","Type":"ContainerDied","Data":"37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0"}
Sep 30 19:06:14 crc kubenswrapper[4747]: I0930 19:06:14.961063 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" event={"ID":"7322c2b5-665e-4f0f-9680-002ba350622c","Type":"ContainerStarted","Data":"efd414a9ef4c673ebdab4361be5476a28f45a0d4f33507f325e00be55809e2e6"}
Sep 30 19:06:15 crc kubenswrapper[4747]: I0930 19:06:15.030020 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vwjq7" podStartSLOduration=3.029997632 podStartE2EDuration="3.029997632s" podCreationTimestamp="2025-09-30 19:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:14.971006149 +0000 UTC m=+1214.630486263" watchObservedRunningTime="2025-09-30 19:06:15.029997632 +0000 UTC m=+1214.689477746"
Sep 30 19:06:15 crc kubenswrapper[4747]: I0930 19:06:15.970205 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-krqdj" event={"ID":"85ca5396-cd5d-4c87-930f-0891e66a5613","Type":"ContainerStarted","Data":"5ce8c55fe8c8c4997a73a025688c8da411856bdb754446af35f9fe2f1f23976d"}
Sep 30 19:06:15 crc kubenswrapper[4747]: I0930 19:06:15.970623 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-krqdj" event={"ID":"85ca5396-cd5d-4c87-930f-0891e66a5613","Type":"ContainerStarted","Data":"5490b193c883747c70271e13f6282a66d3727ad3c5d0c101de7a3abf8839ae42"}
Sep 30 19:06:15 crc kubenswrapper[4747]: I0930 19:06:15.976801 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" event={"ID":"7322c2b5-665e-4f0f-9680-002ba350622c","Type":"ContainerStarted","Data":"4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270"}
Sep 30 19:06:15 crc kubenswrapper[4747]: I0930 19:06:15.976829 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq"
Sep 30 19:06:15 crc kubenswrapper[4747]: I0930 19:06:15.990759 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-krqdj" podStartSLOduration=2.990745463 podStartE2EDuration="2.990745463s" podCreationTimestamp="2025-09-30 19:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:15.988125308 +0000 UTC m=+1215.647605442" watchObservedRunningTime="2025-09-30 19:06:15.990745463 +0000 UTC m=+1215.650225577"
Sep 30 19:06:16 crc kubenswrapper[4747]: I0930 19:06:16.009918 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" podStartSLOduration=3.009889103 podStartE2EDuration="3.009889103s" podCreationTimestamp="2025-09-30 19:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:16.007657029 +0000 UTC m=+1215.667137173" watchObservedRunningTime="2025-09-30 19:06:16.009889103 +0000 UTC m=+1215.669369247"
Sep 30 19:06:16 crc kubenswrapper[4747]: I0930 19:06:16.842391 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:06:16 crc kubenswrapper[4747]: I0930 19:06:16.859012 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 19:06:17 crc kubenswrapper[4747]: I0930 19:06:17.992598 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3","Type":"ContainerStarted","Data":"5cd105345704f18a4d79fda424e5cdf5486cfafeec3e47aeba54817ba027a1c0"}
Sep 30 19:06:17 crc kubenswrapper[4747]: I0930 19:06:17.994687 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3","Type":"ContainerStarted","Data":"fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11"}
Sep 30 19:06:17 crc kubenswrapper[4747]: I0930 19:06:17.994729 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3","Type":"ContainerStarted","Data":"f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120"}
Sep 30 19:06:17 crc kubenswrapper[4747]: I0930 19:06:17.997692 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"19e0ae2f-1fde-42db-8386-4ebbe8dcd883","Type":"ContainerStarted","Data":"fe38fd93c43851379580425b30ed9e4910031c92756ba0b704ebaf573ed1c8aa"}
Sep 30 19:06:17 crc kubenswrapper[4747]: I0930 19:06:17.997776 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="19e0ae2f-1fde-42db-8386-4ebbe8dcd883" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://fe38fd93c43851379580425b30ed9e4910031c92756ba0b704ebaf573ed1c8aa" gracePeriod=30
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.000221 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16b7fdc7-0f23-4186-ba14-e167730760b2","Type":"ContainerStarted","Data":"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208"}
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.000262 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16b7fdc7-0f23-4186-ba14-e167730760b2","Type":"ContainerStarted","Data":"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a"}
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.000367 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerName="nova-metadata-log" containerID="cri-o://8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a" gracePeriod=30
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.000514 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerName="nova-metadata-metadata" containerID="cri-o://5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208" gracePeriod=30
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.012940 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.6434248729999998 podStartE2EDuration="5.012909458s" podCreationTimestamp="2025-09-30 19:06:13 +0000 UTC" firstStartedPulling="2025-09-30 19:06:13.791207177 +0000 UTC m=+1213.450687291" lastFinishedPulling="2025-09-30 19:06:17.160691762 +0000 UTC m=+1216.820171876" observedRunningTime="2025-09-30 19:06:18.010330154 +0000 UTC m=+1217.669810278" watchObservedRunningTime="2025-09-30 19:06:18.012909458 +0000 UTC m=+1217.672389572"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.036196 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.04804231 podStartE2EDuration="5.036171716s" podCreationTimestamp="2025-09-30 19:06:13 +0000 UTC" firstStartedPulling="2025-09-30 19:06:14.239014414 +0000 UTC m=+1213.898494528" lastFinishedPulling="2025-09-30 19:06:17.22714382 +0000 UTC m=+1216.886623934" observedRunningTime="2025-09-30 19:06:18.031119051 +0000 UTC m=+1217.690599175" watchObservedRunningTime="2025-09-30 19:06:18.036171716 +0000 UTC m=+1217.695651830"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.054547 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.775778634 podStartE2EDuration="6.054524093s" podCreationTimestamp="2025-09-30 19:06:12 +0000 UTC" firstStartedPulling="2025-09-30 19:06:13.890053725 +0000 UTC m=+1213.549533839" lastFinishedPulling="2025-09-30 19:06:17.168799144 +0000 UTC m=+1216.828279298" observedRunningTime="2025-09-30 19:06:18.049391046 +0000 UTC m=+1217.708871160" watchObservedRunningTime="2025-09-30 19:06:18.054524093 +0000 UTC m=+1217.714004217"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.074444 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.137992602 podStartE2EDuration="5.074421994s" podCreationTimestamp="2025-09-30 19:06:13 +0000 UTC" firstStartedPulling="2025-09-30 19:06:14.230672534 +0000 UTC m=+1213.890152638" lastFinishedPulling="2025-09-30 19:06:17.167101876 +0000 UTC m=+1216.826582030" observedRunningTime="2025-09-30 19:06:18.06798945 +0000 UTC m=+1217.727469564" watchObservedRunningTime="2025-09-30 19:06:18.074421994 +0000 UTC m=+1217.733902108"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.478377 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.629469 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.629510 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.640219 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.643075 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.707541 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-config-data\") pod \"16b7fdc7-0f23-4186-ba14-e167730760b2\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") "
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.707594 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16b7fdc7-0f23-4186-ba14-e167730760b2-logs\") pod \"16b7fdc7-0f23-4186-ba14-e167730760b2\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") "
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.707660 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-combined-ca-bundle\") pod \"16b7fdc7-0f23-4186-ba14-e167730760b2\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") "
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.707687 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4lhw\" (UniqueName: \"kubernetes.io/projected/16b7fdc7-0f23-4186-ba14-e167730760b2-kube-api-access-l4lhw\") pod \"16b7fdc7-0f23-4186-ba14-e167730760b2\" (UID: \"16b7fdc7-0f23-4186-ba14-e167730760b2\") "
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.709247 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16b7fdc7-0f23-4186-ba14-e167730760b2-logs" (OuterVolumeSpecName: "logs") pod "16b7fdc7-0f23-4186-ba14-e167730760b2" (UID: "16b7fdc7-0f23-4186-ba14-e167730760b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.714494 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b7fdc7-0f23-4186-ba14-e167730760b2-kube-api-access-l4lhw" (OuterVolumeSpecName: "kube-api-access-l4lhw") pod "16b7fdc7-0f23-4186-ba14-e167730760b2" (UID: "16b7fdc7-0f23-4186-ba14-e167730760b2"). InnerVolumeSpecName "kube-api-access-l4lhw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.746323 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16b7fdc7-0f23-4186-ba14-e167730760b2" (UID: "16b7fdc7-0f23-4186-ba14-e167730760b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.755979 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-config-data" (OuterVolumeSpecName: "config-data") pod "16b7fdc7-0f23-4186-ba14-e167730760b2" (UID: "16b7fdc7-0f23-4186-ba14-e167730760b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.809802 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.809833 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16b7fdc7-0f23-4186-ba14-e167730760b2-logs\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.809842 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b7fdc7-0f23-4186-ba14-e167730760b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:18 crc kubenswrapper[4747]: I0930 19:06:18.809853 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4lhw\" (UniqueName: \"kubernetes.io/projected/16b7fdc7-0f23-4186-ba14-e167730760b2-kube-api-access-l4lhw\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.013002 4747 generic.go:334] "Generic (PLEG): container finished" podID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerID="5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208" exitCode=0
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.013065 4747 generic.go:334] "Generic (PLEG): container finished" podID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerID="8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a" exitCode=143
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.013087 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.013112 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16b7fdc7-0f23-4186-ba14-e167730760b2","Type":"ContainerDied","Data":"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208"}
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.013195 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16b7fdc7-0f23-4186-ba14-e167730760b2","Type":"ContainerDied","Data":"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a"}
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.013217 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16b7fdc7-0f23-4186-ba14-e167730760b2","Type":"ContainerDied","Data":"2058838a159e7b034f248a40be9bb7042b5c14af7dddc6e97228c1686e1a4182"}
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.013247 4747 scope.go:117] "RemoveContainer" containerID="5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.035295 4747 scope.go:117] "RemoveContainer" containerID="8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.049809 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.057821 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.072287 4747 scope.go:117] "RemoveContainer" containerID="5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208"
Sep 30 19:06:19 crc kubenswrapper[4747]: E0930 19:06:19.072717 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208\": container with ID starting with 5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208 not found: ID does not exist" containerID="5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.072749 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208"} err="failed to get container status \"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208\": rpc error: code = NotFound desc = could not find container \"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208\": container with ID starting with 5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208 not found: ID does not exist"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.072770 4747 scope.go:117] "RemoveContainer" containerID="8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a"
Sep 30 19:06:19 crc kubenswrapper[4747]: E0930 19:06:19.073055 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a\": container with ID starting with 8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a not found: ID does not exist" containerID="8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.073079 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a"} err="failed to get container status \"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a\": rpc error: code = NotFound desc = could not find container \"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a\": container with ID starting with 8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a not found: ID does not exist"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.073094 4747 scope.go:117] "RemoveContainer" containerID="5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.073367 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208"} err="failed to get container status \"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208\": rpc error: code = NotFound desc = could not find container \"5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208\": container with ID starting with 5a76a52600dd920e70a049c0e55ba906ac9e1649160bb3a95f1db81787903208 not found: ID does not exist"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.073418 4747 scope.go:117] "RemoveContainer" containerID="8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.073708 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a"} err="failed to get container status \"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a\": rpc error: code = NotFound desc = could not find container \"8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a\": container with ID starting with 8eda61ae09aeb43e2f9654d0e8bff82138ec9712e3d15cd119eb88d9e124353a not found: ID does not exist"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.077322 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:06:19 crc kubenswrapper[4747]: E0930 19:06:19.077766 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerName="nova-metadata-log"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.077787 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerName="nova-metadata-log"
Sep 30 19:06:19 crc kubenswrapper[4747]: E0930 19:06:19.077825 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerName="nova-metadata-metadata"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.077836 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerName="nova-metadata-metadata"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.078076 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerName="nova-metadata-log"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.078104 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" containerName="nova-metadata-metadata"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.079243 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.081735 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.082302 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.104513 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b7fdc7-0f23-4186-ba14-e167730760b2" path="/var/lib/kubelet/pods/16b7fdc7-0f23-4186-ba14-e167730760b2/volumes"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.105306 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.115001 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.115072 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d374a6-59e4-4435-b35c-ad5358e8d507-logs\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.115091 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qzb\" (UniqueName: \"kubernetes.io/projected/69d374a6-59e4-4435-b35c-ad5358e8d507-kube-api-access-r7qzb\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.115179 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-config-data\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.115197 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.217099 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.217171 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d374a6-59e4-4435-b35c-ad5358e8d507-logs\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.217195 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qzb\" (UniqueName: \"kubernetes.io/projected/69d374a6-59e4-4435-b35c-ad5358e8d507-kube-api-access-r7qzb\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.217264 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-config-data\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.217290 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.217686 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d374a6-59e4-4435-b35c-ad5358e8d507-logs\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.224737 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.229458 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.233640 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-config-data\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.247524 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qzb\" (UniqueName: \"kubernetes.io/projected/69d374a6-59e4-4435-b35c-ad5358e8d507-kube-api-access-r7qzb\") pod \"nova-metadata-0\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.401705 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:06:19 crc kubenswrapper[4747]: I0930 19:06:19.882206 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:06:20 crc kubenswrapper[4747]: I0930 19:06:20.027334 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69d374a6-59e4-4435-b35c-ad5358e8d507","Type":"ContainerStarted","Data":"d3ddaacf3e18d86a789f822c99749d2d3f1af2c655263dc56e1d63a985e0888d"}
Sep 30 19:06:21 crc kubenswrapper[4747]: I0930 19:06:21.037247 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69d374a6-59e4-4435-b35c-ad5358e8d507","Type":"ContainerStarted","Data":"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0"}
Sep 30 19:06:21 crc kubenswrapper[4747]: I0930 19:06:21.037756 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69d374a6-59e4-4435-b35c-ad5358e8d507","Type":"ContainerStarted","Data":"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84"}
Sep 30 19:06:21 crc kubenswrapper[4747]: I0930 19:06:21.063652 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.063628881 podStartE2EDuration="2.063628881s" podCreationTimestamp="2025-09-30 19:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:21.056527847 +0000 UTC m=+1220.716007961" watchObservedRunningTime="2025-09-30 19:06:21.063628881 +0000 UTC m=+1220.723109005"
Sep 30 19:06:22 crc kubenswrapper[4747]: I0930 19:06:22.049242 4747 generic.go:334] "Generic (PLEG): container finished" podID="3aabdf08-0969-4173-ba76-9e55ba35150a" containerID="15f0a5e64f9e3c353b72d8e5d70bcf93ca02dc298bb54cf3639c6a8d482bcc0b" exitCode=0
Sep 30 19:06:22 crc kubenswrapper[4747]: I0930 19:06:22.049310 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vwjq7" event={"ID":"3aabdf08-0969-4173-ba76-9e55ba35150a","Type":"ContainerDied","Data":"15f0a5e64f9e3c353b72d8e5d70bcf93ca02dc298bb54cf3639c6a8d482bcc0b"}
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.061407 4747 generic.go:334] "Generic (PLEG): container finished" podID="85ca5396-cd5d-4c87-930f-0891e66a5613" containerID="5ce8c55fe8c8c4997a73a025688c8da411856bdb754446af35f9fe2f1f23976d" exitCode=0
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.061501 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-krqdj" event={"ID":"85ca5396-cd5d-4c87-930f-0891e66a5613","Type":"ContainerDied","Data":"5ce8c55fe8c8c4997a73a025688c8da411856bdb754446af35f9fe2f1f23976d"}
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.357441 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.357496 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.475328 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.497527 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vwjq7"
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.507591 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.606575 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-scripts\") pod \"3aabdf08-0969-4173-ba76-9e55ba35150a\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") "
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.606618 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-config-data\") pod \"3aabdf08-0969-4173-ba76-9e55ba35150a\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") "
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.606773 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24bh6\" (UniqueName: \"kubernetes.io/projected/3aabdf08-0969-4173-ba76-9e55ba35150a-kube-api-access-24bh6\") pod \"3aabdf08-0969-4173-ba76-9e55ba35150a\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") "
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.606841 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-combined-ca-bundle\") pod \"3aabdf08-0969-4173-ba76-9e55ba35150a\" (UID: \"3aabdf08-0969-4173-ba76-9e55ba35150a\") "
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.612874 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-scripts" (OuterVolumeSpecName: "scripts") pod "3aabdf08-0969-4173-ba76-9e55ba35150a" (UID: "3aabdf08-0969-4173-ba76-9e55ba35150a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.621129 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aabdf08-0969-4173-ba76-9e55ba35150a-kube-api-access-24bh6" (OuterVolumeSpecName: "kube-api-access-24bh6") pod "3aabdf08-0969-4173-ba76-9e55ba35150a" (UID: "3aabdf08-0969-4173-ba76-9e55ba35150a"). InnerVolumeSpecName "kube-api-access-24bh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.637715 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-config-data" (OuterVolumeSpecName: "config-data") pod "3aabdf08-0969-4173-ba76-9e55ba35150a" (UID: "3aabdf08-0969-4173-ba76-9e55ba35150a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.640180 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aabdf08-0969-4173-ba76-9e55ba35150a" (UID: "3aabdf08-0969-4173-ba76-9e55ba35150a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.678088 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq"
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.717376 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.717413 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.717426 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24bh6\" (UniqueName: \"kubernetes.io/projected/3aabdf08-0969-4173-ba76-9e55ba35150a-kube-api-access-24bh6\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.717438 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aabdf08-0969-4173-ba76-9e55ba35150a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.742200 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85fcfdb47c-pn5mf"]
Sep 30 19:06:23 crc kubenswrapper[4747]: I0930 19:06:23.742554 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" podUID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" containerName="dnsmasq-dns" containerID="cri-o://8692a22bd0d14ab0dd6873d5d22e345419a6b0ec7bdbc6de2f34d9b2874554b4" gracePeriod=10
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.079200 4747 generic.go:334] "Generic (PLEG): container finished" podID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" containerID="8692a22bd0d14ab0dd6873d5d22e345419a6b0ec7bdbc6de2f34d9b2874554b4" exitCode=0
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.079295 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" event={"ID":"095a8eef-72de-4bd9-a435-c2b7ef2e832e","Type":"ContainerDied","Data":"8692a22bd0d14ab0dd6873d5d22e345419a6b0ec7bdbc6de2f34d9b2874554b4"}
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.084940 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vwjq7"
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.090770 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vwjq7" event={"ID":"3aabdf08-0969-4173-ba76-9e55ba35150a","Type":"ContainerDied","Data":"144234dcd210363c2306a98603cf37a77d02dbbec84bf22ff86f5589fc710c3c"}
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.090815 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="144234dcd210363c2306a98603cf37a77d02dbbec84bf22ff86f5589fc710c3c"
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.123723 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.188776 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.189055 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-log" containerID="cri-o://f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120" gracePeriod=30
Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.189418 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3"
containerName="nova-api-api" containerID="cri-o://fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11" gracePeriod=30 Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.189705 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.194882 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.164:8774/\": EOF" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.195161 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.164:8774/\": EOF" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.215048 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.236941 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-dns-svc\") pod \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.237033 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-sb\") pod \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.237076 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx95d\" (UniqueName: 
\"kubernetes.io/projected/095a8eef-72de-4bd9-a435-c2b7ef2e832e-kube-api-access-nx95d\") pod \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.237191 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-config\") pod \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.237217 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-nb\") pod \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\" (UID: \"095a8eef-72de-4bd9-a435-c2b7ef2e832e\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.249223 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.249531 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerName="nova-metadata-log" containerID="cri-o://7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84" gracePeriod=30 Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.250017 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerName="nova-metadata-metadata" containerID="cri-o://7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0" gracePeriod=30 Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.252218 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095a8eef-72de-4bd9-a435-c2b7ef2e832e-kube-api-access-nx95d" (OuterVolumeSpecName: 
"kube-api-access-nx95d") pod "095a8eef-72de-4bd9-a435-c2b7ef2e832e" (UID: "095a8eef-72de-4bd9-a435-c2b7ef2e832e"). InnerVolumeSpecName "kube-api-access-nx95d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.306766 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "095a8eef-72de-4bd9-a435-c2b7ef2e832e" (UID: "095a8eef-72de-4bd9-a435-c2b7ef2e832e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.314618 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "095a8eef-72de-4bd9-a435-c2b7ef2e832e" (UID: "095a8eef-72de-4bd9-a435-c2b7ef2e832e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.318445 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "095a8eef-72de-4bd9-a435-c2b7ef2e832e" (UID: "095a8eef-72de-4bd9-a435-c2b7ef2e832e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.336401 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-config" (OuterVolumeSpecName: "config") pod "095a8eef-72de-4bd9-a435-c2b7ef2e832e" (UID: "095a8eef-72de-4bd9-a435-c2b7ef2e832e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.339750 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.339770 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx95d\" (UniqueName: \"kubernetes.io/projected/095a8eef-72de-4bd9-a435-c2b7ef2e832e-kube-api-access-nx95d\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.339781 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.339789 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.339796 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/095a8eef-72de-4bd9-a435-c2b7ef2e832e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.405523 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.405568 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.560989 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.646064 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2sc7\" (UniqueName: \"kubernetes.io/projected/85ca5396-cd5d-4c87-930f-0891e66a5613-kube-api-access-m2sc7\") pod \"85ca5396-cd5d-4c87-930f-0891e66a5613\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.646220 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-scripts\") pod \"85ca5396-cd5d-4c87-930f-0891e66a5613\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.646331 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-combined-ca-bundle\") pod \"85ca5396-cd5d-4c87-930f-0891e66a5613\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.646348 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-config-data\") pod \"85ca5396-cd5d-4c87-930f-0891e66a5613\" (UID: \"85ca5396-cd5d-4c87-930f-0891e66a5613\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.652064 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-scripts" (OuterVolumeSpecName: "scripts") pod "85ca5396-cd5d-4c87-930f-0891e66a5613" (UID: "85ca5396-cd5d-4c87-930f-0891e66a5613"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.672510 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ca5396-cd5d-4c87-930f-0891e66a5613-kube-api-access-m2sc7" (OuterVolumeSpecName: "kube-api-access-m2sc7") pod "85ca5396-cd5d-4c87-930f-0891e66a5613" (UID: "85ca5396-cd5d-4c87-930f-0891e66a5613"). InnerVolumeSpecName "kube-api-access-m2sc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.686195 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-config-data" (OuterVolumeSpecName: "config-data") pod "85ca5396-cd5d-4c87-930f-0891e66a5613" (UID: "85ca5396-cd5d-4c87-930f-0891e66a5613"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.709145 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85ca5396-cd5d-4c87-930f-0891e66a5613" (UID: "85ca5396-cd5d-4c87-930f-0891e66a5613"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.748696 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.749038 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.749049 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ca5396-cd5d-4c87-930f-0891e66a5613-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.749064 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2sc7\" (UniqueName: \"kubernetes.io/projected/85ca5396-cd5d-4c87-930f-0891e66a5613-kube-api-access-m2sc7\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.803115 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.849993 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d374a6-59e4-4435-b35c-ad5358e8d507-logs\") pod \"69d374a6-59e4-4435-b35c-ad5358e8d507\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.850092 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-nova-metadata-tls-certs\") pod \"69d374a6-59e4-4435-b35c-ad5358e8d507\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.850199 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-combined-ca-bundle\") pod \"69d374a6-59e4-4435-b35c-ad5358e8d507\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.850270 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-config-data\") pod \"69d374a6-59e4-4435-b35c-ad5358e8d507\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.850358 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7qzb\" (UniqueName: \"kubernetes.io/projected/69d374a6-59e4-4435-b35c-ad5358e8d507-kube-api-access-r7qzb\") pod \"69d374a6-59e4-4435-b35c-ad5358e8d507\" (UID: \"69d374a6-59e4-4435-b35c-ad5358e8d507\") " Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.864260 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/69d374a6-59e4-4435-b35c-ad5358e8d507-logs" (OuterVolumeSpecName: "logs") pod "69d374a6-59e4-4435-b35c-ad5358e8d507" (UID: "69d374a6-59e4-4435-b35c-ad5358e8d507"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.868146 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d374a6-59e4-4435-b35c-ad5358e8d507-kube-api-access-r7qzb" (OuterVolumeSpecName: "kube-api-access-r7qzb") pod "69d374a6-59e4-4435-b35c-ad5358e8d507" (UID: "69d374a6-59e4-4435-b35c-ad5358e8d507"). InnerVolumeSpecName "kube-api-access-r7qzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.915097 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69d374a6-59e4-4435-b35c-ad5358e8d507" (UID: "69d374a6-59e4-4435-b35c-ad5358e8d507"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.916035 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-config-data" (OuterVolumeSpecName: "config-data") pod "69d374a6-59e4-4435-b35c-ad5358e8d507" (UID: "69d374a6-59e4-4435-b35c-ad5358e8d507"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.924011 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "69d374a6-59e4-4435-b35c-ad5358e8d507" (UID: "69d374a6-59e4-4435-b35c-ad5358e8d507"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.952310 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.952345 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7qzb\" (UniqueName: \"kubernetes.io/projected/69d374a6-59e4-4435-b35c-ad5358e8d507-kube-api-access-r7qzb\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.952358 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d374a6-59e4-4435-b35c-ad5358e8d507-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.952366 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:24 crc kubenswrapper[4747]: I0930 19:06:24.952375 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69d374a6-59e4-4435-b35c-ad5358e8d507-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.101763 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" event={"ID":"095a8eef-72de-4bd9-a435-c2b7ef2e832e","Type":"ContainerDied","Data":"888a137d279163f15def566c4ebb9ed2992074b4653480d6f44e4ed2e87e70bf"} Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.101845 4747 scope.go:117] "RemoveContainer" containerID="8692a22bd0d14ab0dd6873d5d22e345419a6b0ec7bdbc6de2f34d9b2874554b4" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.102057 
4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85fcfdb47c-pn5mf" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.115256 4747 generic.go:334] "Generic (PLEG): container finished" podID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerID="7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0" exitCode=0 Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.115526 4747 generic.go:334] "Generic (PLEG): container finished" podID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerID="7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84" exitCode=143 Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.115687 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.118389 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69d374a6-59e4-4435-b35c-ad5358e8d507","Type":"ContainerDied","Data":"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0"} Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.118452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69d374a6-59e4-4435-b35c-ad5358e8d507","Type":"ContainerDied","Data":"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84"} Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.118464 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"69d374a6-59e4-4435-b35c-ad5358e8d507","Type":"ContainerDied","Data":"d3ddaacf3e18d86a789f822c99749d2d3f1af2c655263dc56e1d63a985e0888d"} Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.126070 4747 generic.go:334] "Generic (PLEG): container finished" podID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerID="f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120" exitCode=143 Sep 30 19:06:25 crc 
kubenswrapper[4747]: I0930 19:06:25.126148 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3","Type":"ContainerDied","Data":"f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120"} Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.137862 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-krqdj" event={"ID":"85ca5396-cd5d-4c87-930f-0891e66a5613","Type":"ContainerDied","Data":"5490b193c883747c70271e13f6282a66d3727ad3c5d0c101de7a3abf8839ae42"} Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.137897 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-krqdj" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.137913 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5490b193c883747c70271e13f6282a66d3727ad3c5d0c101de7a3abf8839ae42" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.163196 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 19:06:25 crc kubenswrapper[4747]: E0930 19:06:25.163867 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ca5396-cd5d-4c87-930f-0891e66a5613" containerName="nova-cell1-conductor-db-sync" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.163886 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ca5396-cd5d-4c87-930f-0891e66a5613" containerName="nova-cell1-conductor-db-sync" Sep 30 19:06:25 crc kubenswrapper[4747]: E0930 19:06:25.163907 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerName="nova-metadata-log" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.163915 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerName="nova-metadata-log" Sep 30 
19:06:25 crc kubenswrapper[4747]: E0930 19:06:25.164034 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aabdf08-0969-4173-ba76-9e55ba35150a" containerName="nova-manage" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.164046 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aabdf08-0969-4173-ba76-9e55ba35150a" containerName="nova-manage" Sep 30 19:06:25 crc kubenswrapper[4747]: E0930 19:06:25.164059 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" containerName="dnsmasq-dns" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.164067 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" containerName="dnsmasq-dns" Sep 30 19:06:25 crc kubenswrapper[4747]: E0930 19:06:25.164082 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerName="nova-metadata-metadata" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.164089 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerName="nova-metadata-metadata" Sep 30 19:06:25 crc kubenswrapper[4747]: E0930 19:06:25.164110 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" containerName="init" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.164117 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" containerName="init" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.164324 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerName="nova-metadata-metadata" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.164344 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" containerName="nova-metadata-log" Sep 30 19:06:25 crc 
kubenswrapper[4747]: I0930 19:06:25.164356 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aabdf08-0969-4173-ba76-9e55ba35150a" containerName="nova-manage" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.164366 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ca5396-cd5d-4c87-930f-0891e66a5613" containerName="nova-cell1-conductor-db-sync" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.164384 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" containerName="dnsmasq-dns" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.165124 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.175406 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.177518 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.242032 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85fcfdb47c-pn5mf"] Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.248151 4747 scope.go:117] "RemoveContainer" containerID="c578be2151e70e7ae5c5dbbc2e5ce3c2cdb49fb851e8a025e80b514d0a150885" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.249249 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85fcfdb47c-pn5mf"] Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.261858 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7296927d-605d-4885-a57a-d489d59a2bb6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " 
pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.262014 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7296927d-605d-4885-a57a-d489d59a2bb6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.262072 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m64s\" (UniqueName: \"kubernetes.io/projected/7296927d-605d-4885-a57a-d489d59a2bb6-kube-api-access-2m64s\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.286042 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.297636 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.299801 4747 scope.go:117] "RemoveContainer" containerID="7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.305969 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.307426 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.309713 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.310003 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.327892 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.334588 4747 scope.go:117] "RemoveContainer" containerID="7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.356542 4747 scope.go:117] "RemoveContainer" containerID="7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0" Sep 30 19:06:25 crc kubenswrapper[4747]: E0930 19:06:25.357035 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0\": container with ID starting with 7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0 not found: ID does not exist" containerID="7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.357073 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0"} err="failed to get container status \"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0\": rpc error: code = NotFound desc = could not find container \"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0\": container with ID starting with 7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0 not found: ID does not exist" Sep 30 19:06:25 crc 
kubenswrapper[4747]: I0930 19:06:25.357099 4747 scope.go:117] "RemoveContainer" containerID="7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84" Sep 30 19:06:25 crc kubenswrapper[4747]: E0930 19:06:25.357711 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84\": container with ID starting with 7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84 not found: ID does not exist" containerID="7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.357759 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84"} err="failed to get container status \"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84\": rpc error: code = NotFound desc = could not find container \"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84\": container with ID starting with 7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84 not found: ID does not exist" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.357773 4747 scope.go:117] "RemoveContainer" containerID="7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.358551 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0"} err="failed to get container status \"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0\": rpc error: code = NotFound desc = could not find container \"7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0\": container with ID starting with 7c3120dfd895831a25b79a8f8dd8fbab66b45325fc9bfd6442ec2e2912d13fe0 not found: ID does not exist" Sep 30 
19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.358569 4747 scope.go:117] "RemoveContainer" containerID="7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.360476 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84"} err="failed to get container status \"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84\": rpc error: code = NotFound desc = could not find container \"7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84\": container with ID starting with 7dfdb60bfe544dd816ba305c8b0059de37f5824c1139b094ea42148ea506dc84 not found: ID does not exist" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.364045 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7296927d-605d-4885-a57a-d489d59a2bb6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.364080 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ece23be-fca5-4627-baf4-d25857fb7570-logs\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.364317 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjxjc\" (UniqueName: \"kubernetes.io/projected/1ece23be-fca5-4627-baf4-d25857fb7570-kube-api-access-cjxjc\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.364346 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.364376 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m64s\" (UniqueName: \"kubernetes.io/projected/7296927d-605d-4885-a57a-d489d59a2bb6-kube-api-access-2m64s\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.364424 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-config-data\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.364471 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.364490 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7296927d-605d-4885-a57a-d489d59a2bb6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.369408 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7296927d-605d-4885-a57a-d489d59a2bb6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.382819 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7296927d-605d-4885-a57a-d489d59a2bb6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.383908 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m64s\" (UniqueName: \"kubernetes.io/projected/7296927d-605d-4885-a57a-d489d59a2bb6-kube-api-access-2m64s\") pod \"nova-cell1-conductor-0\" (UID: \"7296927d-605d-4885-a57a-d489d59a2bb6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.466146 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.466235 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ece23be-fca5-4627-baf4-d25857fb7570-logs\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.466269 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxjc\" (UniqueName: \"kubernetes.io/projected/1ece23be-fca5-4627-baf4-d25857fb7570-kube-api-access-cjxjc\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") 
" pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.466292 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.466341 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-config-data\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.467151 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ece23be-fca5-4627-baf4-d25857fb7570-logs\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.470036 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.470470 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-config-data\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.470913 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.485351 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjxjc\" (UniqueName: \"kubernetes.io/projected/1ece23be-fca5-4627-baf4-d25857fb7570-kube-api-access-cjxjc\") pod \"nova-metadata-0\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.559258 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.633577 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:06:25 crc kubenswrapper[4747]: I0930 19:06:25.833907 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 19:06:26 crc kubenswrapper[4747]: I0930 19:06:26.152766 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7296927d-605d-4885-a57a-d489d59a2bb6","Type":"ContainerStarted","Data":"e2635476ce9652c06258e992e2a53fa67fe5e4d29a16520273f50224d61a9aa9"} Sep 30 19:06:26 crc kubenswrapper[4747]: I0930 19:06:26.153149 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7296927d-605d-4885-a57a-d489d59a2bb6","Type":"ContainerStarted","Data":"5c16b96811ae938772e1f0c4cf03845cb6585b2a6315d45e835fa3f7e2705cbb"} Sep 30 19:06:26 crc kubenswrapper[4747]: I0930 19:06:26.154100 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:26 crc kubenswrapper[4747]: I0930 19:06:26.161148 4747 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" containerName="nova-scheduler-scheduler" containerID="cri-o://5cd105345704f18a4d79fda424e5cdf5486cfafeec3e47aeba54817ba027a1c0" gracePeriod=30 Sep 30 19:06:26 crc kubenswrapper[4747]: I0930 19:06:26.186615 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:06:26 crc kubenswrapper[4747]: I0930 19:06:26.191714 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.191698262 podStartE2EDuration="1.191698262s" podCreationTimestamp="2025-09-30 19:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:26.179154662 +0000 UTC m=+1225.838634826" watchObservedRunningTime="2025-09-30 19:06:26.191698262 +0000 UTC m=+1225.851178376" Sep 30 19:06:26 crc kubenswrapper[4747]: W0930 19:06:26.193698 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ece23be_fca5_4627_baf4_d25857fb7570.slice/crio-b125e397baaa2e344483bf8e69ffe0995f5825e367ae099b93f2adce2b5b01e0 WatchSource:0}: Error finding container b125e397baaa2e344483bf8e69ffe0995f5825e367ae099b93f2adce2b5b01e0: Status 404 returned error can't find the container with id b125e397baaa2e344483bf8e69ffe0995f5825e367ae099b93f2adce2b5b01e0 Sep 30 19:06:27 crc kubenswrapper[4747]: I0930 19:06:27.107291 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095a8eef-72de-4bd9-a435-c2b7ef2e832e" path="/var/lib/kubelet/pods/095a8eef-72de-4bd9-a435-c2b7ef2e832e/volumes" Sep 30 19:06:27 crc kubenswrapper[4747]: I0930 19:06:27.108845 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d374a6-59e4-4435-b35c-ad5358e8d507" path="/var/lib/kubelet/pods/69d374a6-59e4-4435-b35c-ad5358e8d507/volumes" Sep 30 19:06:27 crc 
kubenswrapper[4747]: I0930 19:06:27.180345 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ece23be-fca5-4627-baf4-d25857fb7570","Type":"ContainerStarted","Data":"7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288"} Sep 30 19:06:27 crc kubenswrapper[4747]: I0930 19:06:27.180398 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ece23be-fca5-4627-baf4-d25857fb7570","Type":"ContainerStarted","Data":"fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3"} Sep 30 19:06:27 crc kubenswrapper[4747]: I0930 19:06:27.180422 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ece23be-fca5-4627-baf4-d25857fb7570","Type":"ContainerStarted","Data":"b125e397baaa2e344483bf8e69ffe0995f5825e367ae099b93f2adce2b5b01e0"} Sep 30 19:06:27 crc kubenswrapper[4747]: I0930 19:06:27.239494 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.239451882 podStartE2EDuration="2.239451882s" podCreationTimestamp="2025-09-30 19:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:27.220383745 +0000 UTC m=+1226.879863929" watchObservedRunningTime="2025-09-30 19:06:27.239451882 +0000 UTC m=+1226.898932036" Sep 30 19:06:28 crc kubenswrapper[4747]: E0930 19:06:28.484047 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5cd105345704f18a4d79fda424e5cdf5486cfafeec3e47aeba54817ba027a1c0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 19:06:28 crc kubenswrapper[4747]: E0930 19:06:28.489431 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5cd105345704f18a4d79fda424e5cdf5486cfafeec3e47aeba54817ba027a1c0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 19:06:28 crc kubenswrapper[4747]: E0930 19:06:28.491847 4747 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5cd105345704f18a4d79fda424e5cdf5486cfafeec3e47aeba54817ba027a1c0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 19:06:28 crc kubenswrapper[4747]: E0930 19:06:28.491961 4747 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" containerName="nova-scheduler-scheduler" Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.208038 4747 generic.go:334] "Generic (PLEG): container finished" podID="be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" containerID="5cd105345704f18a4d79fda424e5cdf5486cfafeec3e47aeba54817ba027a1c0" exitCode=0 Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.208236 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3","Type":"ContainerDied","Data":"5cd105345704f18a4d79fda424e5cdf5486cfafeec3e47aeba54817ba027a1c0"} Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.569508 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.641548 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-config-data\") pod \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.641619 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-combined-ca-bundle\") pod \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.641663 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkxnp\" (UniqueName: \"kubernetes.io/projected/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-kube-api-access-vkxnp\") pod \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\" (UID: \"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3\") " Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.665331 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-kube-api-access-vkxnp" (OuterVolumeSpecName: "kube-api-access-vkxnp") pod "be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" (UID: "be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3"). InnerVolumeSpecName "kube-api-access-vkxnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.675159 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" (UID: "be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.692509 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-config-data" (OuterVolumeSpecName: "config-data") pod "be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" (UID: "be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.744786 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.744817 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:29 crc kubenswrapper[4747]: I0930 19:06:29.744834 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkxnp\" (UniqueName: \"kubernetes.io/projected/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3-kube-api-access-vkxnp\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.060498 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.152254 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-config-data\") pod \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.152468 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-combined-ca-bundle\") pod \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.152538 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-logs\") pod \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.152677 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx7r2\" (UniqueName: \"kubernetes.io/projected/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-kube-api-access-rx7r2\") pod \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\" (UID: \"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3\") " Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.154884 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-logs" (OuterVolumeSpecName: "logs") pod "cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" (UID: "cbc27be3-fbf3-4b13-bc3d-896782fa5ef3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.156535 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-kube-api-access-rx7r2" (OuterVolumeSpecName: "kube-api-access-rx7r2") pod "cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" (UID: "cbc27be3-fbf3-4b13-bc3d-896782fa5ef3"). InnerVolumeSpecName "kube-api-access-rx7r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.182873 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-config-data" (OuterVolumeSpecName: "config-data") pod "cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" (UID: "cbc27be3-fbf3-4b13-bc3d-896782fa5ef3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.186497 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" (UID: "cbc27be3-fbf3-4b13-bc3d-896782fa5ef3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.220498 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.220522 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3","Type":"ContainerDied","Data":"d48f2d9e5325413497f58077afc94f0c06997f001737ffeda048fd71e9a42201"} Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.220589 4747 scope.go:117] "RemoveContainer" containerID="5cd105345704f18a4d79fda424e5cdf5486cfafeec3e47aeba54817ba027a1c0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.225754 4747 generic.go:334] "Generic (PLEG): container finished" podID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerID="fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11" exitCode=0 Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.225816 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3","Type":"ContainerDied","Data":"fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11"} Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.225858 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc27be3-fbf3-4b13-bc3d-896782fa5ef3","Type":"ContainerDied","Data":"89b7fa9ea706527b8edd4650b6c832a1d385c334386360180854ab0511d59d82"} Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.225980 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.257220 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx7r2\" (UniqueName: \"kubernetes.io/projected/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-kube-api-access-rx7r2\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.257275 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.257296 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.257314 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.299869 4747 scope.go:117] "RemoveContainer" containerID="fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.303852 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.331242 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.338189 4747 scope.go:117] "RemoveContainer" containerID="f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.339554 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.348222 4747 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.366877 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:30 crc kubenswrapper[4747]: E0930 19:06:30.367269 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" containerName="nova-scheduler-scheduler" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.367289 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" containerName="nova-scheduler-scheduler" Sep 30 19:06:30 crc kubenswrapper[4747]: E0930 19:06:30.367306 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-log" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.367312 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-log" Sep 30 19:06:30 crc kubenswrapper[4747]: E0930 19:06:30.367339 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-api" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.367345 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-api" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.367499 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" containerName="nova-scheduler-scheduler" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.367510 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-log" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.367524 4747 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" containerName="nova-api-api" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.370713 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.371955 4747 scope.go:117] "RemoveContainer" containerID="fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.372665 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.377300 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:30 crc kubenswrapper[4747]: E0930 19:06:30.379307 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11\": container with ID starting with fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11 not found: ID does not exist" containerID="fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.379346 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11"} err="failed to get container status \"fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11\": rpc error: code = NotFound desc = could not find container \"fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11\": container with ID starting with fc5f182737292f1f9a54dee752e1a82a2e5077f520465405897755b7528e5a11 not found: ID does not exist" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.379372 4747 scope.go:117] "RemoveContainer" containerID="f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120" Sep 30 
19:06:30 crc kubenswrapper[4747]: E0930 19:06:30.380003 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120\": container with ID starting with f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120 not found: ID does not exist" containerID="f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.380027 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120"} err="failed to get container status \"f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120\": rpc error: code = NotFound desc = could not find container \"f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120\": container with ID starting with f07b3d0a9fe49f28fba201766ab6ba7c9e9e39c46630ccb626807abe3761e120 not found: ID does not exist" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.386523 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.388512 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.390415 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.412332 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.463626 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-config-data\") pod \"nova-scheduler-0\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.463676 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhdxx\" (UniqueName: \"kubernetes.io/projected/a493d172-e7e2-4ab6-984a-1e638a445f46-kube-api-access-rhdxx\") pod \"nova-scheduler-0\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.463723 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbl6n\" (UniqueName: \"kubernetes.io/projected/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-kube-api-access-zbl6n\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.463753 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-logs\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.463875 4747 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-config-data\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.464045 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.464250 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.565580 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-config-data\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.565648 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.565714 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.565788 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-config-data\") pod \"nova-scheduler-0\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.566317 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhdxx\" (UniqueName: \"kubernetes.io/projected/a493d172-e7e2-4ab6-984a-1e638a445f46-kube-api-access-rhdxx\") pod \"nova-scheduler-0\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.566371 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbl6n\" (UniqueName: \"kubernetes.io/projected/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-kube-api-access-zbl6n\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.566397 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-logs\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.566710 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-logs\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 
19:06:30.569995 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.570807 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.571540 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-config-data\") pod \"nova-scheduler-0\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.573328 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-config-data\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.587584 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbl6n\" (UniqueName: \"kubernetes.io/projected/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-kube-api-access-zbl6n\") pod \"nova-api-0\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") " pod="openstack/nova-api-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.597159 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhdxx\" (UniqueName: \"kubernetes.io/projected/a493d172-e7e2-4ab6-984a-1e638a445f46-kube-api-access-rhdxx\") pod \"nova-scheduler-0\" 
(UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.634378 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.634714 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.701227 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:06:30 crc kubenswrapper[4747]: I0930 19:06:30.712691 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:06:31 crc kubenswrapper[4747]: I0930 19:06:31.116014 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3" path="/var/lib/kubelet/pods/be5e1f4d-ed6a-4e0b-b5b0-87019d7260f3/volumes" Sep 30 19:06:31 crc kubenswrapper[4747]: I0930 19:06:31.116766 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc27be3-fbf3-4b13-bc3d-896782fa5ef3" path="/var/lib/kubelet/pods/cbc27be3-fbf3-4b13-bc3d-896782fa5ef3/volumes" Sep 30 19:06:31 crc kubenswrapper[4747]: I0930 19:06:31.238909 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:06:31 crc kubenswrapper[4747]: I0930 19:06:31.323082 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:06:31 crc kubenswrapper[4747]: W0930 19:06:31.335865 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb51e10_57c3_47fb_9ed5_cd78033ec5e2.slice/crio-69dd9e057b75087b960c7ff8956c941909ff8b491df297887bd2bfbc588f1017 WatchSource:0}: Error finding container 69dd9e057b75087b960c7ff8956c941909ff8b491df297887bd2bfbc588f1017: Status 404 returned error can't 
find the container with id 69dd9e057b75087b960c7ff8956c941909ff8b491df297887bd2bfbc588f1017 Sep 30 19:06:32 crc kubenswrapper[4747]: I0930 19:06:32.266325 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a493d172-e7e2-4ab6-984a-1e638a445f46","Type":"ContainerStarted","Data":"f7565c76da7c9d3a4fb05b4ac4c84d07f2e4a54b67d751d53ed5df70fe85fc2d"} Sep 30 19:06:32 crc kubenswrapper[4747]: I0930 19:06:32.267082 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a493d172-e7e2-4ab6-984a-1e638a445f46","Type":"ContainerStarted","Data":"bbedd07ad5b08e9b357467792842959011b7f8add7219f211a372bab5be75557"} Sep 30 19:06:32 crc kubenswrapper[4747]: I0930 19:06:32.269406 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"feb51e10-57c3-47fb-9ed5-cd78033ec5e2","Type":"ContainerStarted","Data":"56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027"} Sep 30 19:06:32 crc kubenswrapper[4747]: I0930 19:06:32.269467 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"feb51e10-57c3-47fb-9ed5-cd78033ec5e2","Type":"ContainerStarted","Data":"df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde"} Sep 30 19:06:32 crc kubenswrapper[4747]: I0930 19:06:32.269487 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"feb51e10-57c3-47fb-9ed5-cd78033ec5e2","Type":"ContainerStarted","Data":"69dd9e057b75087b960c7ff8956c941909ff8b491df297887bd2bfbc588f1017"} Sep 30 19:06:32 crc kubenswrapper[4747]: I0930 19:06:32.286863 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.286845945 podStartE2EDuration="2.286845945s" podCreationTimestamp="2025-09-30 19:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 19:06:32.285395093 +0000 UTC m=+1231.944875207" watchObservedRunningTime="2025-09-30 19:06:32.286845945 +0000 UTC m=+1231.946326049" Sep 30 19:06:32 crc kubenswrapper[4747]: I0930 19:06:32.323777 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.323752505 podStartE2EDuration="2.323752505s" podCreationTimestamp="2025-09-30 19:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:32.313691386 +0000 UTC m=+1231.973171580" watchObservedRunningTime="2025-09-30 19:06:32.323752505 +0000 UTC m=+1231.983232659" Sep 30 19:06:35 crc kubenswrapper[4747]: I0930 19:06:35.594756 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 19:06:35 crc kubenswrapper[4747]: I0930 19:06:35.634163 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 19:06:35 crc kubenswrapper[4747]: I0930 19:06:35.634220 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 19:06:35 crc kubenswrapper[4747]: I0930 19:06:35.702164 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 19:06:36 crc kubenswrapper[4747]: I0930 19:06:36.685150 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.172:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:06:36 crc kubenswrapper[4747]: I0930 19:06:36.685372 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.172:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:06:40 crc kubenswrapper[4747]: I0930 19:06:40.702161 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 19:06:40 crc kubenswrapper[4747]: I0930 19:06:40.712997 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:06:40 crc kubenswrapper[4747]: I0930 19:06:40.713089 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:06:40 crc kubenswrapper[4747]: I0930 19:06:40.752119 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 19:06:41 crc kubenswrapper[4747]: I0930 19:06:41.397608 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 19:06:41 crc kubenswrapper[4747]: I0930 19:06:41.801079 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.174:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 19:06:41 crc kubenswrapper[4747]: I0930 19:06:41.802166 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.174:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 19:06:45 crc kubenswrapper[4747]: I0930 19:06:45.642354 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 19:06:45 crc kubenswrapper[4747]: I0930 19:06:45.648728 4747 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 19:06:45 crc kubenswrapper[4747]: I0930 19:06:45.649678 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 19:06:46 crc kubenswrapper[4747]: I0930 19:06:46.429226 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.441215 4747 generic.go:334] "Generic (PLEG): container finished" podID="19e0ae2f-1fde-42db-8386-4ebbe8dcd883" containerID="fe38fd93c43851379580425b30ed9e4910031c92756ba0b704ebaf573ed1c8aa" exitCode=137 Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.441416 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"19e0ae2f-1fde-42db-8386-4ebbe8dcd883","Type":"ContainerDied","Data":"fe38fd93c43851379580425b30ed9e4910031c92756ba0b704ebaf573ed1c8aa"} Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.442046 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"19e0ae2f-1fde-42db-8386-4ebbe8dcd883","Type":"ContainerDied","Data":"4f943d465213c9ada3732a358593b465881a8c455bd3148ca8403cee557944f7"} Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.442087 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f943d465213c9ada3732a358593b465881a8c455bd3148ca8403cee557944f7" Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.493722 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.527288 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-combined-ca-bundle\") pod \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.527413 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgzq6\" (UniqueName: \"kubernetes.io/projected/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-kube-api-access-bgzq6\") pod \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.527467 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-config-data\") pod \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\" (UID: \"19e0ae2f-1fde-42db-8386-4ebbe8dcd883\") " Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.537392 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-kube-api-access-bgzq6" (OuterVolumeSpecName: "kube-api-access-bgzq6") pod "19e0ae2f-1fde-42db-8386-4ebbe8dcd883" (UID: "19e0ae2f-1fde-42db-8386-4ebbe8dcd883"). InnerVolumeSpecName "kube-api-access-bgzq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.567678 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19e0ae2f-1fde-42db-8386-4ebbe8dcd883" (UID: "19e0ae2f-1fde-42db-8386-4ebbe8dcd883"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.575491 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-config-data" (OuterVolumeSpecName: "config-data") pod "19e0ae2f-1fde-42db-8386-4ebbe8dcd883" (UID: "19e0ae2f-1fde-42db-8386-4ebbe8dcd883"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.629895 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.630007 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgzq6\" (UniqueName: \"kubernetes.io/projected/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-kube-api-access-bgzq6\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:48 crc kubenswrapper[4747]: I0930 19:06:48.630029 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e0ae2f-1fde-42db-8386-4ebbe8dcd883-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.453458 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.491494 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.509495 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.526107 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:06:49 crc kubenswrapper[4747]: E0930 19:06:49.526692 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e0ae2f-1fde-42db-8386-4ebbe8dcd883" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.526722 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e0ae2f-1fde-42db-8386-4ebbe8dcd883" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.527073 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e0ae2f-1fde-42db-8386-4ebbe8dcd883" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.528079 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.530852 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.531609 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.536507 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.547005 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.650609 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.651050 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.651111 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc 
kubenswrapper[4747]: I0930 19:06:49.651339 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65rjr\" (UniqueName: \"kubernetes.io/projected/ac0b1b32-053b-4384-81a6-d02aa6a15e65-kube-api-access-65rjr\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.651433 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.753216 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.753287 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.753460 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65rjr\" (UniqueName: \"kubernetes.io/projected/ac0b1b32-053b-4384-81a6-d02aa6a15e65-kube-api-access-65rjr\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 
19:06:49.753544 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.753646 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.759609 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.759731 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.760121 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.762755 4747 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0b1b32-053b-4384-81a6-d02aa6a15e65-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.787751 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65rjr\" (UniqueName: \"kubernetes.io/projected/ac0b1b32-053b-4384-81a6-d02aa6a15e65-kube-api-access-65rjr\") pod \"nova-cell1-novncproxy-0\" (UID: \"ac0b1b32-053b-4384-81a6-d02aa6a15e65\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:49 crc kubenswrapper[4747]: I0930 19:06:49.852741 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:06:50 crc kubenswrapper[4747]: I0930 19:06:50.146394 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:06:50 crc kubenswrapper[4747]: I0930 19:06:50.468283 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ac0b1b32-053b-4384-81a6-d02aa6a15e65","Type":"ContainerStarted","Data":"a98ab9eca5d2004d52198cb8b0ceb9db4265f333fac0cdc4537bd731ea479410"} Sep 30 19:06:50 crc kubenswrapper[4747]: I0930 19:06:50.468707 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ac0b1b32-053b-4384-81a6-d02aa6a15e65","Type":"ContainerStarted","Data":"c3a11042c985f5dcfec09da103f3a7e6003827f8eed869ce79c36fbd41135acb"} Sep 30 19:06:50 crc kubenswrapper[4747]: I0930 19:06:50.500669 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.500639914 podStartE2EDuration="1.500639914s" podCreationTimestamp="2025-09-30 19:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 19:06:50.489784032 +0000 UTC m=+1250.149264186" watchObservedRunningTime="2025-09-30 19:06:50.500639914 +0000 UTC m=+1250.160120038" Sep 30 19:06:50 crc kubenswrapper[4747]: I0930 19:06:50.719028 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 19:06:50 crc kubenswrapper[4747]: I0930 19:06:50.720162 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 19:06:50 crc kubenswrapper[4747]: I0930 19:06:50.721034 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 19:06:50 crc kubenswrapper[4747]: I0930 19:06:50.725956 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.103393 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e0ae2f-1fde-42db-8386-4ebbe8dcd883" path="/var/lib/kubelet/pods/19e0ae2f-1fde-42db-8386-4ebbe8dcd883/volumes" Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.480771 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.485948 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.715078 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bd94c88c7-zk8hh"] Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.722039 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.730634 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bd94c88c7-zk8hh"]
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.797280 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.797375 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-dns-svc\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.797403 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.797473 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-config\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.797493 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s482\" (UniqueName: \"kubernetes.io/projected/7207174e-ce90-4450-8dac-9d434a26d7ae-kube-api-access-4s482\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.900236 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-config\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.900287 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s482\" (UniqueName: \"kubernetes.io/projected/7207174e-ce90-4450-8dac-9d434a26d7ae-kube-api-access-4s482\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.900379 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.900438 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-dns-svc\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.900475 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.901335 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.901829 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-config\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.902685 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-dns-svc\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.902965 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7207174e-ce90-4450-8dac-9d434a26d7ae-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:51 crc kubenswrapper[4747]: I0930 19:06:51.930769 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s482\" (UniqueName: \"kubernetes.io/projected/7207174e-ce90-4450-8dac-9d434a26d7ae-kube-api-access-4s482\") pod \"dnsmasq-dns-6bd94c88c7-zk8hh\" (UID: \"7207174e-ce90-4450-8dac-9d434a26d7ae\") " pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:52 crc kubenswrapper[4747]: I0930 19:06:52.046014 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:52 crc kubenswrapper[4747]: I0930 19:06:52.511359 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bd94c88c7-zk8hh"]
Sep 30 19:06:52 crc kubenswrapper[4747]: W0930 19:06:52.512048 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7207174e_ce90_4450_8dac_9d434a26d7ae.slice/crio-1c8f602d6c511c34c67deaa956f4902d9dfe8900268694d14602b15eb869fc88 WatchSource:0}: Error finding container 1c8f602d6c511c34c67deaa956f4902d9dfe8900268694d14602b15eb869fc88: Status 404 returned error can't find the container with id 1c8f602d6c511c34c67deaa956f4902d9dfe8900268694d14602b15eb869fc88
Sep 30 19:06:53 crc kubenswrapper[4747]: I0930 19:06:53.494913 4747 generic.go:334] "Generic (PLEG): container finished" podID="7207174e-ce90-4450-8dac-9d434a26d7ae" containerID="0bc91a1d90f929927f3aec890fbdc25eb2edf517f8d299e5cadbd60a769f8e81" exitCode=0
Sep 30 19:06:53 crc kubenswrapper[4747]: I0930 19:06:53.494998 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh" event={"ID":"7207174e-ce90-4450-8dac-9d434a26d7ae","Type":"ContainerDied","Data":"0bc91a1d90f929927f3aec890fbdc25eb2edf517f8d299e5cadbd60a769f8e81"}
Sep 30 19:06:53 crc kubenswrapper[4747]: I0930 19:06:53.495391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh" event={"ID":"7207174e-ce90-4450-8dac-9d434a26d7ae","Type":"ContainerStarted","Data":"1c8f602d6c511c34c67deaa956f4902d9dfe8900268694d14602b15eb869fc88"}
Sep 30 19:06:54 crc kubenswrapper[4747]: I0930 19:06:54.211784 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:06:54 crc kubenswrapper[4747]: I0930 19:06:54.506391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh" event={"ID":"7207174e-ce90-4450-8dac-9d434a26d7ae","Type":"ContainerStarted","Data":"9b61d7f4a6a694c94ce3e0626740fda4da388242cd0db60add456b1132fb9503"}
Sep 30 19:06:54 crc kubenswrapper[4747]: I0930 19:06:54.506564 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh"
Sep 30 19:06:54 crc kubenswrapper[4747]: I0930 19:06:54.507043 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-log" containerID="cri-o://df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde" gracePeriod=30
Sep 30 19:06:54 crc kubenswrapper[4747]: I0930 19:06:54.507097 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-api" containerID="cri-o://56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027" gracePeriod=30
Sep 30 19:06:54 crc kubenswrapper[4747]: I0930 19:06:54.532518 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh" podStartSLOduration=3.53249329 podStartE2EDuration="3.53249329s" podCreationTimestamp="2025-09-30 19:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:06:54.530522534 +0000 UTC m=+1254.190002678" watchObservedRunningTime="2025-09-30 19:06:54.53249329 +0000 UTC m=+1254.191973414"
Sep 30 19:06:54 crc kubenswrapper[4747]: I0930 19:06:54.853709 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:06:55 crc kubenswrapper[4747]: I0930 19:06:55.517706 4747 generic.go:334] "Generic (PLEG): container finished" podID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerID="df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde" exitCode=143
Sep 30 19:06:55 crc kubenswrapper[4747]: I0930 19:06:55.517767 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"feb51e10-57c3-47fb-9ed5-cd78033ec5e2","Type":"ContainerDied","Data":"df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde"}
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.155906 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.218323 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-config-data\") pod \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") "
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.218371 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-combined-ca-bundle\") pod \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") "
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.218427 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbl6n\" (UniqueName: \"kubernetes.io/projected/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-kube-api-access-zbl6n\") pod \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") "
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.218503 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-logs\") pod \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\" (UID: \"feb51e10-57c3-47fb-9ed5-cd78033ec5e2\") "
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.219604 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-logs" (OuterVolumeSpecName: "logs") pod "feb51e10-57c3-47fb-9ed5-cd78033ec5e2" (UID: "feb51e10-57c3-47fb-9ed5-cd78033ec5e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.226157 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-kube-api-access-zbl6n" (OuterVolumeSpecName: "kube-api-access-zbl6n") pod "feb51e10-57c3-47fb-9ed5-cd78033ec5e2" (UID: "feb51e10-57c3-47fb-9ed5-cd78033ec5e2"). InnerVolumeSpecName "kube-api-access-zbl6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.248197 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-config-data" (OuterVolumeSpecName: "config-data") pod "feb51e10-57c3-47fb-9ed5-cd78033ec5e2" (UID: "feb51e10-57c3-47fb-9ed5-cd78033ec5e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.255660 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feb51e10-57c3-47fb-9ed5-cd78033ec5e2" (UID: "feb51e10-57c3-47fb-9ed5-cd78033ec5e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.324187 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.324267 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.324280 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbl6n\" (UniqueName: \"kubernetes.io/projected/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-kube-api-access-zbl6n\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.324291 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb51e10-57c3-47fb-9ed5-cd78033ec5e2-logs\") on node \"crc\" DevicePath \"\""
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.569842 4747 generic.go:334] "Generic (PLEG): container finished" podID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerID="56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027" exitCode=0
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.570851 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"feb51e10-57c3-47fb-9ed5-cd78033ec5e2","Type":"ContainerDied","Data":"56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027"}
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.570908 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"feb51e10-57c3-47fb-9ed5-cd78033ec5e2","Type":"ContainerDied","Data":"69dd9e057b75087b960c7ff8956c941909ff8b491df297887bd2bfbc588f1017"}
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.570963 4747 scope.go:117] "RemoveContainer" containerID="56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.570997 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.595619 4747 scope.go:117] "RemoveContainer" containerID="df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.623034 4747 scope.go:117] "RemoveContainer" containerID="56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027"
Sep 30 19:06:58 crc kubenswrapper[4747]: E0930 19:06:58.623729 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027\": container with ID starting with 56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027 not found: ID does not exist" containerID="56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.623797 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027"} err="failed to get container status \"56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027\": rpc error: code = NotFound desc = could not find container \"56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027\": container with ID starting with 56150fc9cf20957837d95fbb9bbfd9b649d6795fd29c16d5694a0852c84a3027 not found: ID does not exist"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.623839 4747 scope.go:117] "RemoveContainer" containerID="df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde"
Sep 30 19:06:58 crc kubenswrapper[4747]: E0930 19:06:58.624437 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde\": container with ID starting with df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde not found: ID does not exist" containerID="df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.624518 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde"} err="failed to get container status \"df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde\": rpc error: code = NotFound desc = could not find container \"df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde\": container with ID starting with df829c155ff860ccc17670dba7b66b55381b1ebac95f5b53eb8eeaeca6e30fde not found: ID does not exist"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.639095 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.655851 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.667474 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:06:58 crc kubenswrapper[4747]: E0930 19:06:58.668029 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-log"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.668056 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-log"
Sep 30 19:06:58 crc kubenswrapper[4747]: E0930 19:06:58.668076 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-api"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.668084 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-api"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.668327 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-log"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.668350 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" containerName="nova-api-api"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.669550 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.674178 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.675125 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.675330 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.678450 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.731762 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.731884 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ceda77-074b-4930-97bd-49b9a73b217e-logs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.732015 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2pw6\" (UniqueName: \"kubernetes.io/projected/44ceda77-074b-4930-97bd-49b9a73b217e-kube-api-access-d2pw6\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.732424 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-config-data\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.732608 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-public-tls-certs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.732774 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.835224 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-public-tls-certs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.835462 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.835545 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.835676 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ceda77-074b-4930-97bd-49b9a73b217e-logs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.835740 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2pw6\" (UniqueName: \"kubernetes.io/projected/44ceda77-074b-4930-97bd-49b9a73b217e-kube-api-access-d2pw6\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.836030 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-config-data\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.836991 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ceda77-074b-4930-97bd-49b9a73b217e-logs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.841770 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-public-tls-certs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.842720 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.843882 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.844588 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-config-data\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.865767 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2pw6\" (UniqueName: \"kubernetes.io/projected/44ceda77-074b-4930-97bd-49b9a73b217e-kube-api-access-d2pw6\") pod \"nova-api-0\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " pod="openstack/nova-api-0"
Sep 30 19:06:58 crc kubenswrapper[4747]: I0930 19:06:58.997392 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 19:06:59 crc kubenswrapper[4747]: I0930 19:06:59.102863 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb51e10-57c3-47fb-9ed5-cd78033ec5e2" path="/var/lib/kubelet/pods/feb51e10-57c3-47fb-9ed5-cd78033ec5e2/volumes"
Sep 30 19:06:59 crc kubenswrapper[4747]: I0930 19:06:59.301501 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:06:59 crc kubenswrapper[4747]: I0930 19:06:59.586223 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"44ceda77-074b-4930-97bd-49b9a73b217e","Type":"ContainerStarted","Data":"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab"}
Sep 30 19:06:59 crc kubenswrapper[4747]: I0930 19:06:59.586280 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"44ceda77-074b-4930-97bd-49b9a73b217e","Type":"ContainerStarted","Data":"b0576f2355b491c7a0cc0d37714ed30301fa52b8e2793d271fa7378efc8abb26"}
Sep 30 19:06:59 crc kubenswrapper[4747]: I0930 19:06:59.854560 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:06:59 crc kubenswrapper[4747]: I0930 19:06:59.887148 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.602213 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"44ceda77-074b-4930-97bd-49b9a73b217e","Type":"ContainerStarted","Data":"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4"}
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.636030 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.655231 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.655203195 podStartE2EDuration="2.655203195s" podCreationTimestamp="2025-09-30 19:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:07:00.627482519 +0000 UTC m=+1260.286962653" watchObservedRunningTime="2025-09-30 19:07:00.655203195 +0000 UTC m=+1260.314683349"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.825658 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7nkhl"]
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.827277 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.831144 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.831357 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.832866 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7nkhl"]
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.888782 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzq5v\" (UniqueName: \"kubernetes.io/projected/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-kube-api-access-jzq5v\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.888845 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-config-data\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.888886 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.889004 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-scripts\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.992649 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-scripts\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.994049 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzq5v\" (UniqueName: \"kubernetes.io/projected/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-kube-api-access-jzq5v\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.994664 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-config-data\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.994756 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.996201 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Sep 30 19:07:00 crc kubenswrapper[4747]: I0930 19:07:00.996432 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Sep 30 19:07:01 crc kubenswrapper[4747]: I0930 19:07:01.001358 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:01 crc kubenswrapper[4747]: I0930 19:07:01.011715 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-scripts\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:01 crc kubenswrapper[4747]: I0930 19:07:01.016060 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-config-data\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl"
Sep 30 19:07:01 crc kubenswrapper[4747]: I0930 19:07:01.016305 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzq5v\" (UniqueName: \"kubernetes.io/projected/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-kube-api-access-jzq5v\") pod \"nova-cell1-cell-mapping-7nkhl\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " pod="openstack/nova-cell1-cell-mapping-7nkhl" Sep 30 19:07:01 crc kubenswrapper[4747]: I0930 19:07:01.152178 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7nkhl" Sep 30 19:07:01 crc kubenswrapper[4747]: I0930 19:07:01.694654 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7nkhl"] Sep 30 19:07:01 crc kubenswrapper[4747]: W0930 19:07:01.697566 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3cf400_e562_4174_ac4a_7deb1d8d0be5.slice/crio-73ae68462108349328936354aa05125190545a5f68801d3b662ebd609a8752d9 WatchSource:0}: Error finding container 73ae68462108349328936354aa05125190545a5f68801d3b662ebd609a8752d9: Status 404 returned error can't find the container with id 73ae68462108349328936354aa05125190545a5f68801d3b662ebd609a8752d9 Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.047798 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bd94c88c7-zk8hh" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.109705 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbc9bd96f-m9fjq"] Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.109949 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" podUID="7322c2b5-665e-4f0f-9680-002ba350622c" containerName="dnsmasq-dns" containerID="cri-o://4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270" gracePeriod=10 Sep 30 
19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.575447 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.629517 4747 generic.go:334] "Generic (PLEG): container finished" podID="7322c2b5-665e-4f0f-9680-002ba350622c" containerID="4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270" exitCode=0 Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.629640 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" event={"ID":"7322c2b5-665e-4f0f-9680-002ba350622c","Type":"ContainerDied","Data":"4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270"} Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.629686 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" event={"ID":"7322c2b5-665e-4f0f-9680-002ba350622c","Type":"ContainerDied","Data":"efd414a9ef4c673ebdab4361be5476a28f45a0d4f33507f325e00be55809e2e6"} Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.629703 4747 scope.go:117] "RemoveContainer" containerID="4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.629870 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbc9bd96f-m9fjq" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.632951 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7nkhl" event={"ID":"9b3cf400-e562-4174-ac4a-7deb1d8d0be5","Type":"ContainerStarted","Data":"5f098f8104e59d3a9a4d6f5f6f08a5607f23681c771af810c45716cd2a121dca"} Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.632999 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7nkhl" event={"ID":"9b3cf400-e562-4174-ac4a-7deb1d8d0be5","Type":"ContainerStarted","Data":"73ae68462108349328936354aa05125190545a5f68801d3b662ebd609a8752d9"} Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.642274 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-sb\") pod \"7322c2b5-665e-4f0f-9680-002ba350622c\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.642333 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-dns-svc\") pod \"7322c2b5-665e-4f0f-9680-002ba350622c\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.642382 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-nb\") pod \"7322c2b5-665e-4f0f-9680-002ba350622c\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.642429 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j46tm\" (UniqueName: 
\"kubernetes.io/projected/7322c2b5-665e-4f0f-9680-002ba350622c-kube-api-access-j46tm\") pod \"7322c2b5-665e-4f0f-9680-002ba350622c\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.642592 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-config\") pod \"7322c2b5-665e-4f0f-9680-002ba350622c\" (UID: \"7322c2b5-665e-4f0f-9680-002ba350622c\") " Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.664139 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7nkhl" podStartSLOduration=2.664118757 podStartE2EDuration="2.664118757s" podCreationTimestamp="2025-09-30 19:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:07:02.652954417 +0000 UTC m=+1262.312434531" watchObservedRunningTime="2025-09-30 19:07:02.664118757 +0000 UTC m=+1262.323598871" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.666175 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7322c2b5-665e-4f0f-9680-002ba350622c-kube-api-access-j46tm" (OuterVolumeSpecName: "kube-api-access-j46tm") pod "7322c2b5-665e-4f0f-9680-002ba350622c" (UID: "7322c2b5-665e-4f0f-9680-002ba350622c"). InnerVolumeSpecName "kube-api-access-j46tm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.666299 4747 scope.go:117] "RemoveContainer" containerID="37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.705719 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-config" (OuterVolumeSpecName: "config") pod "7322c2b5-665e-4f0f-9680-002ba350622c" (UID: "7322c2b5-665e-4f0f-9680-002ba350622c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.707880 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7322c2b5-665e-4f0f-9680-002ba350622c" (UID: "7322c2b5-665e-4f0f-9680-002ba350622c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.708877 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7322c2b5-665e-4f0f-9680-002ba350622c" (UID: "7322c2b5-665e-4f0f-9680-002ba350622c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.710150 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7322c2b5-665e-4f0f-9680-002ba350622c" (UID: "7322c2b5-665e-4f0f-9680-002ba350622c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.724429 4747 scope.go:117] "RemoveContainer" containerID="4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270" Sep 30 19:07:02 crc kubenswrapper[4747]: E0930 19:07:02.724879 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270\": container with ID starting with 4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270 not found: ID does not exist" containerID="4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.724916 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270"} err="failed to get container status \"4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270\": rpc error: code = NotFound desc = could not find container \"4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270\": container with ID starting with 4a133c1f52cf43f5f49fbb88876065cd215fda531d3e5b160334676787cd5270 not found: ID does not exist" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.724985 4747 scope.go:117] "RemoveContainer" containerID="37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0" Sep 30 19:07:02 crc kubenswrapper[4747]: E0930 19:07:02.725366 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0\": container with ID starting with 37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0 not found: ID does not exist" containerID="37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.725395 
4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0"} err="failed to get container status \"37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0\": rpc error: code = NotFound desc = could not find container \"37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0\": container with ID starting with 37441ac25edbe9a251bd208648593614451f6fda028518531ee2c5597c7d0da0 not found: ID does not exist" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.744775 4747 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.744798 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.744807 4747 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.744815 4747 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7322c2b5-665e-4f0f-9680-002ba350622c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.744823 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j46tm\" (UniqueName: \"kubernetes.io/projected/7322c2b5-665e-4f0f-9680-002ba350622c-kube-api-access-j46tm\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.967297 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5cbc9bd96f-m9fjq"] Sep 30 19:07:02 crc kubenswrapper[4747]: I0930 19:07:02.973691 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cbc9bd96f-m9fjq"] Sep 30 19:07:03 crc kubenswrapper[4747]: I0930 19:07:03.102158 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7322c2b5-665e-4f0f-9680-002ba350622c" path="/var/lib/kubelet/pods/7322c2b5-665e-4f0f-9680-002ba350622c/volumes" Sep 30 19:07:06 crc kubenswrapper[4747]: I0930 19:07:06.676592 4747 generic.go:334] "Generic (PLEG): container finished" podID="9b3cf400-e562-4174-ac4a-7deb1d8d0be5" containerID="5f098f8104e59d3a9a4d6f5f6f08a5607f23681c771af810c45716cd2a121dca" exitCode=0 Sep 30 19:07:06 crc kubenswrapper[4747]: I0930 19:07:06.676697 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7nkhl" event={"ID":"9b3cf400-e562-4174-ac4a-7deb1d8d0be5","Type":"ContainerDied","Data":"5f098f8104e59d3a9a4d6f5f6f08a5607f23681c771af810c45716cd2a121dca"} Sep 30 19:07:07 crc kubenswrapper[4747]: I0930 19:07:07.655554 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:07:07 crc kubenswrapper[4747]: I0930 19:07:07.656090 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.067022 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7nkhl" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.189050 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-scripts\") pod \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.189131 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-config-data\") pod \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.189224 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzq5v\" (UniqueName: \"kubernetes.io/projected/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-kube-api-access-jzq5v\") pod \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.189454 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-combined-ca-bundle\") pod \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\" (UID: \"9b3cf400-e562-4174-ac4a-7deb1d8d0be5\") " Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.198447 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-kube-api-access-jzq5v" (OuterVolumeSpecName: "kube-api-access-jzq5v") pod "9b3cf400-e562-4174-ac4a-7deb1d8d0be5" (UID: "9b3cf400-e562-4174-ac4a-7deb1d8d0be5"). InnerVolumeSpecName "kube-api-access-jzq5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.200566 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-scripts" (OuterVolumeSpecName: "scripts") pod "9b3cf400-e562-4174-ac4a-7deb1d8d0be5" (UID: "9b3cf400-e562-4174-ac4a-7deb1d8d0be5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.222848 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b3cf400-e562-4174-ac4a-7deb1d8d0be5" (UID: "9b3cf400-e562-4174-ac4a-7deb1d8d0be5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.236649 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-config-data" (OuterVolumeSpecName: "config-data") pod "9b3cf400-e562-4174-ac4a-7deb1d8d0be5" (UID: "9b3cf400-e562-4174-ac4a-7deb1d8d0be5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.293447 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.293488 4747 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.293501 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.293512 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzq5v\" (UniqueName: \"kubernetes.io/projected/9b3cf400-e562-4174-ac4a-7deb1d8d0be5-kube-api-access-jzq5v\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.701336 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7nkhl" event={"ID":"9b3cf400-e562-4174-ac4a-7deb1d8d0be5","Type":"ContainerDied","Data":"73ae68462108349328936354aa05125190545a5f68801d3b662ebd609a8752d9"} Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.701396 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ae68462108349328936354aa05125190545a5f68801d3b662ebd609a8752d9" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.701450 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7nkhl" Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.917561 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.917829 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" containerName="nova-api-log" containerID="cri-o://95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab" gracePeriod=30 Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.917964 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" containerName="nova-api-api" containerID="cri-o://7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4" gracePeriod=30 Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.931627 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:07:08 crc kubenswrapper[4747]: I0930 19:07:08.931855 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a493d172-e7e2-4ab6-984a-1e638a445f46" containerName="nova-scheduler-scheduler" containerID="cri-o://f7565c76da7c9d3a4fb05b4ac4c84d07f2e4a54b67d751d53ed5df70fe85fc2d" gracePeriod=30 Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.013773 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.014064 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-log" containerID="cri-o://fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3" gracePeriod=30 Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.014226 4747 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-metadata" containerID="cri-o://7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288" gracePeriod=30 Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.561524 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.713021 4747 generic.go:334] "Generic (PLEG): container finished" podID="a493d172-e7e2-4ab6-984a-1e638a445f46" containerID="f7565c76da7c9d3a4fb05b4ac4c84d07f2e4a54b67d751d53ed5df70fe85fc2d" exitCode=0 Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.713077 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a493d172-e7e2-4ab6-984a-1e638a445f46","Type":"ContainerDied","Data":"f7565c76da7c9d3a4fb05b4ac4c84d07f2e4a54b67d751d53ed5df70fe85fc2d"} Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.720525 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-combined-ca-bundle\") pod \"44ceda77-074b-4930-97bd-49b9a73b217e\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.720576 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-config-data\") pod \"44ceda77-074b-4930-97bd-49b9a73b217e\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.720616 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2pw6\" (UniqueName: \"kubernetes.io/projected/44ceda77-074b-4930-97bd-49b9a73b217e-kube-api-access-d2pw6\") pod 
\"44ceda77-074b-4930-97bd-49b9a73b217e\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.720665 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-public-tls-certs\") pod \"44ceda77-074b-4930-97bd-49b9a73b217e\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.720684 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-internal-tls-certs\") pod \"44ceda77-074b-4930-97bd-49b9a73b217e\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.720727 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ceda77-074b-4930-97bd-49b9a73b217e-logs\") pod \"44ceda77-074b-4930-97bd-49b9a73b217e\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.721883 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44ceda77-074b-4930-97bd-49b9a73b217e-logs" (OuterVolumeSpecName: "logs") pod "44ceda77-074b-4930-97bd-49b9a73b217e" (UID: "44ceda77-074b-4930-97bd-49b9a73b217e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.722149 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.727200 4747 generic.go:334] "Generic (PLEG): container finished" podID="44ceda77-074b-4930-97bd-49b9a73b217e" containerID="7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4" exitCode=0 Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.727235 4747 generic.go:334] "Generic (PLEG): container finished" podID="44ceda77-074b-4930-97bd-49b9a73b217e" containerID="95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab" exitCode=143 Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.727300 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"44ceda77-074b-4930-97bd-49b9a73b217e","Type":"ContainerDied","Data":"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4"} Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.727371 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"44ceda77-074b-4930-97bd-49b9a73b217e","Type":"ContainerDied","Data":"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab"} Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.727380 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"44ceda77-074b-4930-97bd-49b9a73b217e","Type":"ContainerDied","Data":"b0576f2355b491c7a0cc0d37714ed30301fa52b8e2793d271fa7378efc8abb26"} Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.727394 4747 scope.go:117] "RemoveContainer" containerID="7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.727524 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.731068 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ceda77-074b-4930-97bd-49b9a73b217e-kube-api-access-d2pw6" (OuterVolumeSpecName: "kube-api-access-d2pw6") pod "44ceda77-074b-4930-97bd-49b9a73b217e" (UID: "44ceda77-074b-4930-97bd-49b9a73b217e"). InnerVolumeSpecName "kube-api-access-d2pw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.745662 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ece23be-fca5-4627-baf4-d25857fb7570" containerID="fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3" exitCode=143 Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.745707 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ece23be-fca5-4627-baf4-d25857fb7570","Type":"ContainerDied","Data":"fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3"} Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.810588 4747 scope.go:117] "RemoveContainer" containerID="95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.816148 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-config-data" (OuterVolumeSpecName: "config-data") pod "44ceda77-074b-4930-97bd-49b9a73b217e" (UID: "44ceda77-074b-4930-97bd-49b9a73b217e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.824787 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-combined-ca-bundle\") pod \"a493d172-e7e2-4ab6-984a-1e638a445f46\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.824869 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44ceda77-074b-4930-97bd-49b9a73b217e" (UID: "44ceda77-074b-4930-97bd-49b9a73b217e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.825053 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhdxx\" (UniqueName: \"kubernetes.io/projected/a493d172-e7e2-4ab6-984a-1e638a445f46-kube-api-access-rhdxx\") pod \"a493d172-e7e2-4ab6-984a-1e638a445f46\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.825120 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-internal-tls-certs\") pod \"44ceda77-074b-4930-97bd-49b9a73b217e\" (UID: \"44ceda77-074b-4930-97bd-49b9a73b217e\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.825151 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-config-data\") pod \"a493d172-e7e2-4ab6-984a-1e638a445f46\" (UID: \"a493d172-e7e2-4ab6-984a-1e638a445f46\") " Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.825527 4747 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.825540 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2pw6\" (UniqueName: \"kubernetes.io/projected/44ceda77-074b-4930-97bd-49b9a73b217e-kube-api-access-d2pw6\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.825550 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44ceda77-074b-4930-97bd-49b9a73b217e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:09 crc kubenswrapper[4747]: W0930 19:07:09.826140 4747 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/44ceda77-074b-4930-97bd-49b9a73b217e/volumes/kubernetes.io~secret/internal-tls-certs Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.826161 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44ceda77-074b-4930-97bd-49b9a73b217e" (UID: "44ceda77-074b-4930-97bd-49b9a73b217e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.828391 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a493d172-e7e2-4ab6-984a-1e638a445f46-kube-api-access-rhdxx" (OuterVolumeSpecName: "kube-api-access-rhdxx") pod "a493d172-e7e2-4ab6-984a-1e638a445f46" (UID: "a493d172-e7e2-4ab6-984a-1e638a445f46"). InnerVolumeSpecName "kube-api-access-rhdxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.834434 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44ceda77-074b-4930-97bd-49b9a73b217e" (UID: "44ceda77-074b-4930-97bd-49b9a73b217e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.837026 4747 scope.go:117] "RemoveContainer" containerID="7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4" Sep 30 19:07:09 crc kubenswrapper[4747]: E0930 19:07:09.837495 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4\": container with ID starting with 7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4 not found: ID does not exist" containerID="7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.837545 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4"} err="failed to get container status \"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4\": rpc error: code = NotFound desc = could not find container \"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4\": container with ID starting with 7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4 not found: ID does not exist" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.837576 4747 scope.go:117] "RemoveContainer" containerID="95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab" Sep 30 19:07:09 crc kubenswrapper[4747]: E0930 19:07:09.837903 4747 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab\": container with ID starting with 95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab not found: ID does not exist" containerID="95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.837997 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab"} err="failed to get container status \"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab\": rpc error: code = NotFound desc = could not find container \"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab\": container with ID starting with 95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab not found: ID does not exist" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.838014 4747 scope.go:117] "RemoveContainer" containerID="7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.838334 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4"} err="failed to get container status \"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4\": rpc error: code = NotFound desc = could not find container \"7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4\": container with ID starting with 7062c5bae45ed6e7dfc0930a88d9632d844c7d185483d5cad2689c76578073f4 not found: ID does not exist" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.838362 4747 scope.go:117] "RemoveContainer" containerID="95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.838607 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab"} err="failed to get container status \"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab\": rpc error: code = NotFound desc = could not find container \"95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab\": container with ID starting with 95bf6c02fbbd966d84652ccce9301a1ecaf91683e3265f808a4a33914ca702ab not found: ID does not exist" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.850884 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a493d172-e7e2-4ab6-984a-1e638a445f46" (UID: "a493d172-e7e2-4ab6-984a-1e638a445f46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.858741 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-config-data" (OuterVolumeSpecName: "config-data") pod "a493d172-e7e2-4ab6-984a-1e638a445f46" (UID: "a493d172-e7e2-4ab6-984a-1e638a445f46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.875699 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "44ceda77-074b-4930-97bd-49b9a73b217e" (UID: "44ceda77-074b-4930-97bd-49b9a73b217e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.927194 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.927234 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.927247 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhdxx\" (UniqueName: \"kubernetes.io/projected/a493d172-e7e2-4ab6-984a-1e638a445f46-kube-api-access-rhdxx\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.927261 4747 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.927271 4747 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ceda77-074b-4930-97bd-49b9a73b217e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:09 crc kubenswrapper[4747]: I0930 19:07:09.927279 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a493d172-e7e2-4ab6-984a-1e638a445f46-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.124393 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.131671 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:07:10 crc 
kubenswrapper[4747]: I0930 19:07:10.152606 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 19:07:10 crc kubenswrapper[4747]: E0930 19:07:10.153151 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" containerName="nova-api-api" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153185 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" containerName="nova-api-api" Sep 30 19:07:10 crc kubenswrapper[4747]: E0930 19:07:10.153213 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a493d172-e7e2-4ab6-984a-1e638a445f46" containerName="nova-scheduler-scheduler" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153228 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a493d172-e7e2-4ab6-984a-1e638a445f46" containerName="nova-scheduler-scheduler" Sep 30 19:07:10 crc kubenswrapper[4747]: E0930 19:07:10.153259 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7322c2b5-665e-4f0f-9680-002ba350622c" containerName="init" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153271 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7322c2b5-665e-4f0f-9680-002ba350622c" containerName="init" Sep 30 19:07:10 crc kubenswrapper[4747]: E0930 19:07:10.153298 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7322c2b5-665e-4f0f-9680-002ba350622c" containerName="dnsmasq-dns" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153311 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7322c2b5-665e-4f0f-9680-002ba350622c" containerName="dnsmasq-dns" Sep 30 19:07:10 crc kubenswrapper[4747]: E0930 19:07:10.153333 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3cf400-e562-4174-ac4a-7deb1d8d0be5" containerName="nova-manage" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153343 4747 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9b3cf400-e562-4174-ac4a-7deb1d8d0be5" containerName="nova-manage" Sep 30 19:07:10 crc kubenswrapper[4747]: E0930 19:07:10.153366 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" containerName="nova-api-log" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153376 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" containerName="nova-api-log" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153743 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" containerName="nova-api-api" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153790 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a493d172-e7e2-4ab6-984a-1e638a445f46" containerName="nova-scheduler-scheduler" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153809 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3cf400-e562-4174-ac4a-7deb1d8d0be5" containerName="nova-manage" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153828 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7322c2b5-665e-4f0f-9680-002ba350622c" containerName="dnsmasq-dns" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.153845 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" containerName="nova-api-log" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.155390 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.159245 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.159518 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.159730 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.170261 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.232962 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdqqc\" (UniqueName: \"kubernetes.io/projected/6dbab27f-1fd5-4576-a5e7-41ec04946217-kube-api-access-sdqqc\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.232998 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.233034 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.233064 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-public-tls-certs\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.233262 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-config-data\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.233368 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbab27f-1fd5-4576-a5e7-41ec04946217-logs\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.335292 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.335379 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-public-tls-certs\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.335493 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-config-data\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 
crc kubenswrapper[4747]: I0930 19:07:10.335564 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbab27f-1fd5-4576-a5e7-41ec04946217-logs\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.335712 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.335752 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdqqc\" (UniqueName: \"kubernetes.io/projected/6dbab27f-1fd5-4576-a5e7-41ec04946217-kube-api-access-sdqqc\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.339690 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbab27f-1fd5-4576-a5e7-41ec04946217-logs\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.344794 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-public-tls-certs\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.344884 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.345526 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-config-data\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.345719 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbab27f-1fd5-4576-a5e7-41ec04946217-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.361441 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdqqc\" (UniqueName: \"kubernetes.io/projected/6dbab27f-1fd5-4576-a5e7-41ec04946217-kube-api-access-sdqqc\") pod \"nova-api-0\" (UID: \"6dbab27f-1fd5-4576-a5e7-41ec04946217\") " pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.477336 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.771283 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.771296 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a493d172-e7e2-4ab6-984a-1e638a445f46","Type":"ContainerDied","Data":"bbedd07ad5b08e9b357467792842959011b7f8add7219f211a372bab5be75557"} Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.771342 4747 scope.go:117] "RemoveContainer" containerID="f7565c76da7c9d3a4fb05b4ac4c84d07f2e4a54b67d751d53ed5df70fe85fc2d" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.819301 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.832778 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:07:10 crc kubenswrapper[4747]: W0930 19:07:10.834909 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dbab27f_1fd5_4576_a5e7_41ec04946217.slice/crio-29665d05d67ff22c95bab78da35c50196c8be584f0cd8951bec44382922726bf WatchSource:0}: Error finding container 29665d05d67ff22c95bab78da35c50196c8be584f0cd8951bec44382922726bf: Status 404 returned error can't find the container with id 29665d05d67ff22c95bab78da35c50196c8be584f0cd8951bec44382922726bf Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.842895 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.860920 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.862609 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.864657 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.871660 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.946124 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzzr\" (UniqueName: \"kubernetes.io/projected/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-kube-api-access-qrzzr\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.946308 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:10 crc kubenswrapper[4747]: I0930 19:07:10.946343 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-config-data\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.048790 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.048829 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-config-data\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.049002 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzzr\" (UniqueName: \"kubernetes.io/projected/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-kube-api-access-qrzzr\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.053570 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-config-data\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.054695 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.065454 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzzr\" (UniqueName: \"kubernetes.io/projected/fc984869-b7e5-4d68-9d09-8bf0f1815e2c-kube-api-access-qrzzr\") pod \"nova-scheduler-0\" (UID: \"fc984869-b7e5-4d68-9d09-8bf0f1815e2c\") " pod="openstack/nova-scheduler-0" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.102710 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ceda77-074b-4930-97bd-49b9a73b217e" path="/var/lib/kubelet/pods/44ceda77-074b-4930-97bd-49b9a73b217e/volumes" 
Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.103837 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a493d172-e7e2-4ab6-984a-1e638a445f46" path="/var/lib/kubelet/pods/a493d172-e7e2-4ab6-984a-1e638a445f46/volumes" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.181391 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.690033 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.792919 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6dbab27f-1fd5-4576-a5e7-41ec04946217","Type":"ContainerStarted","Data":"c6c4f706962efa3e9407e8ceca4bdd14b71da8085e25dd73d36460c9c0f4c63e"} Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.793010 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6dbab27f-1fd5-4576-a5e7-41ec04946217","Type":"ContainerStarted","Data":"40f5a6fd432469eb74b3c21de98e69119f05ec1000b0b6265189ed6f4a586194"} Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.793033 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6dbab27f-1fd5-4576-a5e7-41ec04946217","Type":"ContainerStarted","Data":"29665d05d67ff22c95bab78da35c50196c8be584f0cd8951bec44382922726bf"} Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.801658 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc984869-b7e5-4d68-9d09-8bf0f1815e2c","Type":"ContainerStarted","Data":"3e962d2417a9ba099604acb36132f9da70e4134b6796ca6dee32c00166391ace"} Sep 30 19:07:11 crc kubenswrapper[4747]: I0930 19:07:11.825862 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.825837202 podStartE2EDuration="1.825837202s" 
podCreationTimestamp="2025-09-30 19:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:07:11.820805277 +0000 UTC m=+1271.480285461" watchObservedRunningTime="2025-09-30 19:07:11.825837202 +0000 UTC m=+1271.485317316" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.179967 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.172:8775/\": read tcp 10.217.0.2:33802->10.217.0.172:8775: read: connection reset by peer" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.179989 4747 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.172:8775/\": read tcp 10.217.0.2:33806->10.217.0.172:8775: read: connection reset by peer" Sep 30 19:07:12 crc kubenswrapper[4747]: E0930 19:07:12.356435 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ece23be_fca5_4627_baf4_d25857fb7570.slice/crio-conmon-7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288.scope\": RecentStats: unable to find data in memory cache]" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.642331 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.786136 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-combined-ca-bundle\") pod \"1ece23be-fca5-4627-baf4-d25857fb7570\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.786282 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ece23be-fca5-4627-baf4-d25857fb7570-logs\") pod \"1ece23be-fca5-4627-baf4-d25857fb7570\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.786370 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-config-data\") pod \"1ece23be-fca5-4627-baf4-d25857fb7570\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.786402 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-nova-metadata-tls-certs\") pod \"1ece23be-fca5-4627-baf4-d25857fb7570\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.786447 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjxjc\" (UniqueName: \"kubernetes.io/projected/1ece23be-fca5-4627-baf4-d25857fb7570-kube-api-access-cjxjc\") pod \"1ece23be-fca5-4627-baf4-d25857fb7570\" (UID: \"1ece23be-fca5-4627-baf4-d25857fb7570\") " Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.791430 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1ece23be-fca5-4627-baf4-d25857fb7570-logs" (OuterVolumeSpecName: "logs") pod "1ece23be-fca5-4627-baf4-d25857fb7570" (UID: "1ece23be-fca5-4627-baf4-d25857fb7570"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.806163 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ece23be-fca5-4627-baf4-d25857fb7570-kube-api-access-cjxjc" (OuterVolumeSpecName: "kube-api-access-cjxjc") pod "1ece23be-fca5-4627-baf4-d25857fb7570" (UID: "1ece23be-fca5-4627-baf4-d25857fb7570"). InnerVolumeSpecName "kube-api-access-cjxjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.815577 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-config-data" (OuterVolumeSpecName: "config-data") pod "1ece23be-fca5-4627-baf4-d25857fb7570" (UID: "1ece23be-fca5-4627-baf4-d25857fb7570"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.817600 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc984869-b7e5-4d68-9d09-8bf0f1815e2c","Type":"ContainerStarted","Data":"86ddcad963f8ac6ee50358048f6f2531a9cb525c87b3ac2c5e09cf13166f42f6"} Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.823416 4747 generic.go:334] "Generic (PLEG): container finished" podID="1ece23be-fca5-4627-baf4-d25857fb7570" containerID="7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288" exitCode=0 Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.823471 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.823553 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ece23be-fca5-4627-baf4-d25857fb7570","Type":"ContainerDied","Data":"7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288"} Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.823587 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ece23be-fca5-4627-baf4-d25857fb7570","Type":"ContainerDied","Data":"b125e397baaa2e344483bf8e69ffe0995f5825e367ae099b93f2adce2b5b01e0"} Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.823610 4747 scope.go:117] "RemoveContainer" containerID="7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.842164 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ece23be-fca5-4627-baf4-d25857fb7570" (UID: "1ece23be-fca5-4627-baf4-d25857fb7570"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.842857 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.842835573 podStartE2EDuration="2.842835573s" podCreationTimestamp="2025-09-30 19:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:07:12.83856227 +0000 UTC m=+1272.498042384" watchObservedRunningTime="2025-09-30 19:07:12.842835573 +0000 UTC m=+1272.502315687" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.863258 4747 scope.go:117] "RemoveContainer" containerID="fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.864239 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1ece23be-fca5-4627-baf4-d25857fb7570" (UID: "1ece23be-fca5-4627-baf4-d25857fb7570"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.878875 4747 scope.go:117] "RemoveContainer" containerID="7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288" Sep 30 19:07:12 crc kubenswrapper[4747]: E0930 19:07:12.884940 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288\": container with ID starting with 7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288 not found: ID does not exist" containerID="7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.884981 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288"} err="failed to get container status \"7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288\": rpc error: code = NotFound desc = could not find container \"7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288\": container with ID starting with 7c45ec215b40d111ea2513bc901ca4377c21d56088bd399f60f351e5d0a35288 not found: ID does not exist" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.885008 4747 scope.go:117] "RemoveContainer" containerID="fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3" Sep 30 19:07:12 crc kubenswrapper[4747]: E0930 19:07:12.885409 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3\": container with ID starting with fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3 not found: ID does not exist" containerID="fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.885448 
4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3"} err="failed to get container status \"fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3\": rpc error: code = NotFound desc = could not find container \"fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3\": container with ID starting with fb976b5f8b1c830e77c68687a6d626328f81c0909c4b4fabadaaddae991681a3 not found: ID does not exist" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.889720 4747 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ece23be-fca5-4627-baf4-d25857fb7570-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.889745 4747 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.889762 4747 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.889772 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjxjc\" (UniqueName: \"kubernetes.io/projected/1ece23be-fca5-4627-baf4-d25857fb7570-kube-api-access-cjxjc\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:12 crc kubenswrapper[4747]: I0930 19:07:12.889782 4747 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ece23be-fca5-4627-baf4-d25857fb7570-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.145194 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.153731 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.167323 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:07:13 crc kubenswrapper[4747]: E0930 19:07:13.167731 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-log" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.167745 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-log" Sep 30 19:07:13 crc kubenswrapper[4747]: E0930 19:07:13.167763 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-metadata" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.167770 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-metadata" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.168026 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-metadata" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.168052 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" containerName="nova-metadata-log" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.168842 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.171750 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.171941 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.183179 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.296100 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-config-data\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.296207 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hdpt\" (UniqueName: \"kubernetes.io/projected/800e3b38-20c6-4605-98b5-67b9a1ef21f2-kube-api-access-7hdpt\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.296247 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.296296 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/800e3b38-20c6-4605-98b5-67b9a1ef21f2-logs\") pod \"nova-metadata-0\" 
(UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.296338 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.397876 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/800e3b38-20c6-4605-98b5-67b9a1ef21f2-logs\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.397916 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.398011 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-config-data\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.398109 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hdpt\" (UniqueName: \"kubernetes.io/projected/800e3b38-20c6-4605-98b5-67b9a1ef21f2-kube-api-access-7hdpt\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.398142 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.398816 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/800e3b38-20c6-4605-98b5-67b9a1ef21f2-logs\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.403672 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.404264 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-config-data\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.405197 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/800e3b38-20c6-4605-98b5-67b9a1ef21f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.429488 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hdpt\" (UniqueName: \"kubernetes.io/projected/800e3b38-20c6-4605-98b5-67b9a1ef21f2-kube-api-access-7hdpt\") pod 
\"nova-metadata-0\" (UID: \"800e3b38-20c6-4605-98b5-67b9a1ef21f2\") " pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.481576 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:07:13 crc kubenswrapper[4747]: I0930 19:07:13.831832 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:07:13 crc kubenswrapper[4747]: W0930 19:07:13.871989 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod800e3b38_20c6_4605_98b5_67b9a1ef21f2.slice/crio-ffa01b32bc8a24dc7dc6e8015d150cf607fa713e24334cdc32ab4483cd5afded WatchSource:0}: Error finding container ffa01b32bc8a24dc7dc6e8015d150cf607fa713e24334cdc32ab4483cd5afded: Status 404 returned error can't find the container with id ffa01b32bc8a24dc7dc6e8015d150cf607fa713e24334cdc32ab4483cd5afded Sep 30 19:07:14 crc kubenswrapper[4747]: I0930 19:07:14.903724 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"800e3b38-20c6-4605-98b5-67b9a1ef21f2","Type":"ContainerStarted","Data":"b66bd5c48cbc47637683a6c9c7ec5504bb1e031750d4955168ef8763e7603d2d"} Sep 30 19:07:14 crc kubenswrapper[4747]: I0930 19:07:14.904108 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"800e3b38-20c6-4605-98b5-67b9a1ef21f2","Type":"ContainerStarted","Data":"42c92e95ada53ec2322762089e3f767dcc1148b93d2b128bcf2714d5de867176"} Sep 30 19:07:14 crc kubenswrapper[4747]: I0930 19:07:14.904130 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"800e3b38-20c6-4605-98b5-67b9a1ef21f2","Type":"ContainerStarted","Data":"ffa01b32bc8a24dc7dc6e8015d150cf607fa713e24334cdc32ab4483cd5afded"} Sep 30 19:07:14 crc kubenswrapper[4747]: I0930 19:07:14.939256 4747 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.939229769 podStartE2EDuration="1.939229769s" podCreationTimestamp="2025-09-30 19:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:07:14.927751819 +0000 UTC m=+1274.587231973" watchObservedRunningTime="2025-09-30 19:07:14.939229769 +0000 UTC m=+1274.598709893" Sep 30 19:07:15 crc kubenswrapper[4747]: I0930 19:07:15.123104 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ece23be-fca5-4627-baf4-d25857fb7570" path="/var/lib/kubelet/pods/1ece23be-fca5-4627-baf4-d25857fb7570/volumes" Sep 30 19:07:16 crc kubenswrapper[4747]: I0930 19:07:16.183274 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 19:07:18 crc kubenswrapper[4747]: I0930 19:07:18.483203 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:07:18 crc kubenswrapper[4747]: I0930 19:07:18.483682 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:07:20 crc kubenswrapper[4747]: I0930 19:07:20.477917 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:07:20 crc kubenswrapper[4747]: I0930 19:07:20.478025 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:07:21 crc kubenswrapper[4747]: I0930 19:07:21.183186 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 19:07:21 crc kubenswrapper[4747]: I0930 19:07:21.224838 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 19:07:21 crc kubenswrapper[4747]: I0930 19:07:21.493195 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="6dbab27f-1fd5-4576-a5e7-41ec04946217" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.179:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:07:21 crc kubenswrapper[4747]: I0930 19:07:21.493553 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6dbab27f-1fd5-4576-a5e7-41ec04946217" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 19:07:22 crc kubenswrapper[4747]: I0930 19:07:22.018070 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 19:07:23 crc kubenswrapper[4747]: I0930 19:07:23.483079 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 19:07:23 crc kubenswrapper[4747]: I0930 19:07:23.484263 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 19:07:24 crc kubenswrapper[4747]: I0930 19:07:24.497200 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="800e3b38-20c6-4605-98b5-67b9a1ef21f2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:07:24 crc kubenswrapper[4747]: I0930 19:07:24.497269 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="800e3b38-20c6-4605-98b5-67b9a1ef21f2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:07:30 crc kubenswrapper[4747]: I0930 19:07:30.489606 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Sep 30 19:07:30 crc kubenswrapper[4747]: I0930 19:07:30.491217 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 19:07:30 crc kubenswrapper[4747]: I0930 19:07:30.492978 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 19:07:30 crc kubenswrapper[4747]: I0930 19:07:30.502619 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 19:07:31 crc kubenswrapper[4747]: I0930 19:07:31.111735 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 19:07:31 crc kubenswrapper[4747]: I0930 19:07:31.111841 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 19:07:33 crc kubenswrapper[4747]: I0930 19:07:33.491726 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 19:07:33 crc kubenswrapper[4747]: I0930 19:07:33.494583 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 19:07:33 crc kubenswrapper[4747]: I0930 19:07:33.501158 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 19:07:34 crc kubenswrapper[4747]: I0930 19:07:34.130748 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 19:07:37 crc kubenswrapper[4747]: I0930 19:07:37.655468 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:07:37 crc kubenswrapper[4747]: I0930 19:07:37.656062 4747 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:08:07 crc kubenswrapper[4747]: I0930 19:08:07.655701 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:08:07 crc kubenswrapper[4747]: I0930 19:08:07.656674 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:08:07 crc kubenswrapper[4747]: I0930 19:08:07.656765 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 19:08:07 crc kubenswrapper[4747]: I0930 19:08:07.658068 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c119b3f4e3827265fb0e759c838f87c36311898c6328de90654d566acfc99097"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:08:07 crc kubenswrapper[4747]: I0930 19:08:07.658205 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" 
containerID="cri-o://c119b3f4e3827265fb0e759c838f87c36311898c6328de90654d566acfc99097" gracePeriod=600 Sep 30 19:08:08 crc kubenswrapper[4747]: I0930 19:08:08.566040 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="c119b3f4e3827265fb0e759c838f87c36311898c6328de90654d566acfc99097" exitCode=0 Sep 30 19:08:08 crc kubenswrapper[4747]: I0930 19:08:08.566155 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"c119b3f4e3827265fb0e759c838f87c36311898c6328de90654d566acfc99097"} Sep 30 19:08:08 crc kubenswrapper[4747]: I0930 19:08:08.566965 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a"} Sep 30 19:08:08 crc kubenswrapper[4747]: I0930 19:08:08.567016 4747 scope.go:117] "RemoveContainer" containerID="2d1099e7b2e4398f262beea3d468545e7d66ad71310f8a69af5aae38ae2c6601" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.188508 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x2dtg"] Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.192985 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.206247 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x2dtg"] Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.269712 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx29r\" (UniqueName: \"kubernetes.io/projected/815d7109-6526-4517-bc93-171e3c100745-kube-api-access-vx29r\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.270109 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-catalog-content\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.270172 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-utilities\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.372479 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx29r\" (UniqueName: \"kubernetes.io/projected/815d7109-6526-4517-bc93-171e3c100745-kube-api-access-vx29r\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.372631 4747 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-catalog-content\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.372744 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-utilities\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.373217 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-catalog-content\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.373322 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-utilities\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.401801 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx29r\" (UniqueName: \"kubernetes.io/projected/815d7109-6526-4517-bc93-171e3c100745-kube-api-access-vx29r\") pod \"certified-operators-x2dtg\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.535761 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.854355 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x2dtg"] Sep 30 19:10:02 crc kubenswrapper[4747]: I0930 19:10:02.914251 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x2dtg" event={"ID":"815d7109-6526-4517-bc93-171e3c100745","Type":"ContainerStarted","Data":"69ac65c51fb179ad0301f85817d93c6887ae3ba33d39930a4a829ee1e20fa8d8"} Sep 30 19:10:03 crc kubenswrapper[4747]: I0930 19:10:03.929640 4747 generic.go:334] "Generic (PLEG): container finished" podID="815d7109-6526-4517-bc93-171e3c100745" containerID="f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1" exitCode=0 Sep 30 19:10:03 crc kubenswrapper[4747]: I0930 19:10:03.929826 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x2dtg" event={"ID":"815d7109-6526-4517-bc93-171e3c100745","Type":"ContainerDied","Data":"f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1"} Sep 30 19:10:03 crc kubenswrapper[4747]: I0930 19:10:03.936105 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:10:05 crc kubenswrapper[4747]: I0930 19:10:05.958954 4747 generic.go:334] "Generic (PLEG): container finished" podID="815d7109-6526-4517-bc93-171e3c100745" containerID="897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf" exitCode=0 Sep 30 19:10:05 crc kubenswrapper[4747]: I0930 19:10:05.959004 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x2dtg" event={"ID":"815d7109-6526-4517-bc93-171e3c100745","Type":"ContainerDied","Data":"897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf"} Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.359377 4747 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-4r2jp"] Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.362647 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.385810 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4r2jp"] Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.458193 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-catalog-content\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.458378 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-utilities\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.458468 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv7nh\" (UniqueName: \"kubernetes.io/projected/d74509e7-d3a9-4229-ba8b-0e14349dcde8-kube-api-access-wv7nh\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.560464 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-utilities\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " 
pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.560642 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv7nh\" (UniqueName: \"kubernetes.io/projected/d74509e7-d3a9-4229-ba8b-0e14349dcde8-kube-api-access-wv7nh\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.560915 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-catalog-content\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.561353 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-catalog-content\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.561540 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-utilities\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.596834 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv7nh\" (UniqueName: \"kubernetes.io/projected/d74509e7-d3a9-4229-ba8b-0e14349dcde8-kube-api-access-wv7nh\") pod \"redhat-operators-4r2jp\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " pod="openshift-marketplace/redhat-operators-4r2jp" Sep 
30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.701818 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.951436 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4r2jp"] Sep 30 19:10:06 crc kubenswrapper[4747]: W0930 19:10:06.957671 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74509e7_d3a9_4229_ba8b_0e14349dcde8.slice/crio-e9b3be218e8f071c29b5a6fcfcb0296257d8673c3e02588127b8bd7fd98f77b6 WatchSource:0}: Error finding container e9b3be218e8f071c29b5a6fcfcb0296257d8673c3e02588127b8bd7fd98f77b6: Status 404 returned error can't find the container with id e9b3be218e8f071c29b5a6fcfcb0296257d8673c3e02588127b8bd7fd98f77b6 Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.974962 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4r2jp" event={"ID":"d74509e7-d3a9-4229-ba8b-0e14349dcde8","Type":"ContainerStarted","Data":"e9b3be218e8f071c29b5a6fcfcb0296257d8673c3e02588127b8bd7fd98f77b6"} Sep 30 19:10:06 crc kubenswrapper[4747]: I0930 19:10:06.977609 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x2dtg" event={"ID":"815d7109-6526-4517-bc93-171e3c100745","Type":"ContainerStarted","Data":"4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258"} Sep 30 19:10:07 crc kubenswrapper[4747]: I0930 19:10:07.001391 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x2dtg" podStartSLOduration=2.470840532 podStartE2EDuration="5.001373873s" podCreationTimestamp="2025-09-30 19:10:02 +0000 UTC" firstStartedPulling="2025-09-30 19:10:03.935646579 +0000 UTC m=+1443.595126723" lastFinishedPulling="2025-09-30 19:10:06.46617991 +0000 UTC 
m=+1446.125660064" observedRunningTime="2025-09-30 19:10:07.000524088 +0000 UTC m=+1446.660004212" watchObservedRunningTime="2025-09-30 19:10:07.001373873 +0000 UTC m=+1446.660853987" Sep 30 19:10:07 crc kubenswrapper[4747]: E0930 19:10:07.294033 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74509e7_d3a9_4229_ba8b_0e14349dcde8.slice/crio-27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed.scope\": RecentStats: unable to find data in memory cache]" Sep 30 19:10:07 crc kubenswrapper[4747]: I0930 19:10:07.655059 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:10:07 crc kubenswrapper[4747]: I0930 19:10:07.655369 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:10:07 crc kubenswrapper[4747]: I0930 19:10:07.991829 4747 generic.go:334] "Generic (PLEG): container finished" podID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerID="27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed" exitCode=0 Sep 30 19:10:07 crc kubenswrapper[4747]: I0930 19:10:07.991915 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4r2jp" event={"ID":"d74509e7-d3a9-4229-ba8b-0e14349dcde8","Type":"ContainerDied","Data":"27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed"} Sep 30 19:10:08 crc kubenswrapper[4747]: I0930 19:10:08.248534 4747 scope.go:117] 
"RemoveContainer" containerID="0b7e3684ca04ab67e4c4a80052fedafbed4b74a50fc994f4be598431326e2225" Sep 30 19:10:10 crc kubenswrapper[4747]: I0930 19:10:10.024807 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4r2jp" event={"ID":"d74509e7-d3a9-4229-ba8b-0e14349dcde8","Type":"ContainerStarted","Data":"fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0"} Sep 30 19:10:11 crc kubenswrapper[4747]: I0930 19:10:11.041558 4747 generic.go:334] "Generic (PLEG): container finished" podID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerID="fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0" exitCode=0 Sep 30 19:10:11 crc kubenswrapper[4747]: I0930 19:10:11.042033 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4r2jp" event={"ID":"d74509e7-d3a9-4229-ba8b-0e14349dcde8","Type":"ContainerDied","Data":"fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0"} Sep 30 19:10:12 crc kubenswrapper[4747]: I0930 19:10:12.536905 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:12 crc kubenswrapper[4747]: I0930 19:10:12.537379 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:12 crc kubenswrapper[4747]: I0930 19:10:12.617458 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:13 crc kubenswrapper[4747]: I0930 19:10:13.073487 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4r2jp" event={"ID":"d74509e7-d3a9-4229-ba8b-0e14349dcde8","Type":"ContainerStarted","Data":"45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94"} Sep 30 19:10:13 crc kubenswrapper[4747]: I0930 19:10:13.105859 4747 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-operators-4r2jp" podStartSLOduration=2.857959804 podStartE2EDuration="7.105831934s" podCreationTimestamp="2025-09-30 19:10:06 +0000 UTC" firstStartedPulling="2025-09-30 19:10:07.996899225 +0000 UTC m=+1447.656379369" lastFinishedPulling="2025-09-30 19:10:12.244771335 +0000 UTC m=+1451.904251499" observedRunningTime="2025-09-30 19:10:13.09806874 +0000 UTC m=+1452.757548874" watchObservedRunningTime="2025-09-30 19:10:13.105831934 +0000 UTC m=+1452.765312088" Sep 30 19:10:13 crc kubenswrapper[4747]: I0930 19:10:13.155704 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:14 crc kubenswrapper[4747]: I0930 19:10:14.348093 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x2dtg"] Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.094237 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x2dtg" podUID="815d7109-6526-4517-bc93-171e3c100745" containerName="registry-server" containerID="cri-o://4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258" gracePeriod=2 Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.540501 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.655695 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-catalog-content\") pod \"815d7109-6526-4517-bc93-171e3c100745\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.656022 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-utilities\") pod \"815d7109-6526-4517-bc93-171e3c100745\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.656150 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx29r\" (UniqueName: \"kubernetes.io/projected/815d7109-6526-4517-bc93-171e3c100745-kube-api-access-vx29r\") pod \"815d7109-6526-4517-bc93-171e3c100745\" (UID: \"815d7109-6526-4517-bc93-171e3c100745\") " Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.656819 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-utilities" (OuterVolumeSpecName: "utilities") pod "815d7109-6526-4517-bc93-171e3c100745" (UID: "815d7109-6526-4517-bc93-171e3c100745"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.666910 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815d7109-6526-4517-bc93-171e3c100745-kube-api-access-vx29r" (OuterVolumeSpecName: "kube-api-access-vx29r") pod "815d7109-6526-4517-bc93-171e3c100745" (UID: "815d7109-6526-4517-bc93-171e3c100745"). InnerVolumeSpecName "kube-api-access-vx29r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.712338 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "815d7109-6526-4517-bc93-171e3c100745" (UID: "815d7109-6526-4517-bc93-171e3c100745"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.759672 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.760060 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815d7109-6526-4517-bc93-171e3c100745-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:15 crc kubenswrapper[4747]: I0930 19:10:15.760141 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx29r\" (UniqueName: \"kubernetes.io/projected/815d7109-6526-4517-bc93-171e3c100745-kube-api-access-vx29r\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.106007 4747 generic.go:334] "Generic (PLEG): container finished" podID="815d7109-6526-4517-bc93-171e3c100745" containerID="4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258" exitCode=0 Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.106300 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x2dtg" event={"ID":"815d7109-6526-4517-bc93-171e3c100745","Type":"ContainerDied","Data":"4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258"} Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.106327 4747 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-x2dtg" event={"ID":"815d7109-6526-4517-bc93-171e3c100745","Type":"ContainerDied","Data":"69ac65c51fb179ad0301f85817d93c6887ae3ba33d39930a4a829ee1e20fa8d8"} Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.106342 4747 scope.go:117] "RemoveContainer" containerID="4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.106471 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x2dtg" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.158045 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x2dtg"] Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.160382 4747 scope.go:117] "RemoveContainer" containerID="897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.166916 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x2dtg"] Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.190801 4747 scope.go:117] "RemoveContainer" containerID="f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.224889 4747 scope.go:117] "RemoveContainer" containerID="4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258" Sep 30 19:10:16 crc kubenswrapper[4747]: E0930 19:10:16.225299 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258\": container with ID starting with 4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258 not found: ID does not exist" containerID="4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 
19:10:16.225334 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258"} err="failed to get container status \"4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258\": rpc error: code = NotFound desc = could not find container \"4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258\": container with ID starting with 4dc841aff29060158eb8282aed398e9e432f539b8082e81250451a1f58335258 not found: ID does not exist" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.225359 4747 scope.go:117] "RemoveContainer" containerID="897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf" Sep 30 19:10:16 crc kubenswrapper[4747]: E0930 19:10:16.225674 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf\": container with ID starting with 897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf not found: ID does not exist" containerID="897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.225697 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf"} err="failed to get container status \"897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf\": rpc error: code = NotFound desc = could not find container \"897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf\": container with ID starting with 897c61621a4839a72ee2edcbe006b5b226ae3aed92029f6ef7a438bda754ddcf not found: ID does not exist" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.225713 4747 scope.go:117] "RemoveContainer" containerID="f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1" Sep 30 19:10:16 crc 
kubenswrapper[4747]: E0930 19:10:16.225940 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1\": container with ID starting with f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1 not found: ID does not exist" containerID="f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.225964 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1"} err="failed to get container status \"f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1\": rpc error: code = NotFound desc = could not find container \"f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1\": container with ID starting with f1cb1af43b49e317b5da6dba195fd4eee72ecc8da1e38368795d4e042ff51ca1 not found: ID does not exist" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.702456 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:16 crc kubenswrapper[4747]: I0930 19:10:16.702818 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:17 crc kubenswrapper[4747]: I0930 19:10:17.104820 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815d7109-6526-4517-bc93-171e3c100745" path="/var/lib/kubelet/pods/815d7109-6526-4517-bc93-171e3c100745/volumes" Sep 30 19:10:17 crc kubenswrapper[4747]: I0930 19:10:17.779870 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4r2jp" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="registry-server" probeResult="failure" output=< Sep 30 19:10:17 crc kubenswrapper[4747]: timeout: failed to 
connect service ":50051" within 1s Sep 30 19:10:17 crc kubenswrapper[4747]: > Sep 30 19:10:18 crc kubenswrapper[4747]: I0930 19:10:18.973743 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m4clh"] Sep 30 19:10:18 crc kubenswrapper[4747]: E0930 19:10:18.974356 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815d7109-6526-4517-bc93-171e3c100745" containerName="extract-utilities" Sep 30 19:10:18 crc kubenswrapper[4747]: I0930 19:10:18.974383 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="815d7109-6526-4517-bc93-171e3c100745" containerName="extract-utilities" Sep 30 19:10:18 crc kubenswrapper[4747]: E0930 19:10:18.974410 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815d7109-6526-4517-bc93-171e3c100745" containerName="extract-content" Sep 30 19:10:18 crc kubenswrapper[4747]: I0930 19:10:18.974423 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="815d7109-6526-4517-bc93-171e3c100745" containerName="extract-content" Sep 30 19:10:18 crc kubenswrapper[4747]: E0930 19:10:18.974486 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815d7109-6526-4517-bc93-171e3c100745" containerName="registry-server" Sep 30 19:10:18 crc kubenswrapper[4747]: I0930 19:10:18.974499 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="815d7109-6526-4517-bc93-171e3c100745" containerName="registry-server" Sep 30 19:10:18 crc kubenswrapper[4747]: I0930 19:10:18.974802 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="815d7109-6526-4517-bc93-171e3c100745" containerName="registry-server" Sep 30 19:10:18 crc kubenswrapper[4747]: I0930 19:10:18.977092 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.007137 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4clh"] Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.020276 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-catalog-content\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.020532 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-utilities\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.020589 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-597tx\" (UniqueName: \"kubernetes.io/projected/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-kube-api-access-597tx\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.122442 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-utilities\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.122493 4747 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-597tx\" (UniqueName: \"kubernetes.io/projected/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-kube-api-access-597tx\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.122783 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-catalog-content\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.123435 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-catalog-content\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.124545 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-utilities\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.157003 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-597tx\" (UniqueName: \"kubernetes.io/projected/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-kube-api-access-597tx\") pod \"redhat-marketplace-m4clh\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.309733 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:19 crc kubenswrapper[4747]: I0930 19:10:19.551766 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4clh"] Sep 30 19:10:20 crc kubenswrapper[4747]: I0930 19:10:20.153893 4747 generic.go:334] "Generic (PLEG): container finished" podID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerID="6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524" exitCode=0 Sep 30 19:10:20 crc kubenswrapper[4747]: I0930 19:10:20.154008 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4clh" event={"ID":"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129","Type":"ContainerDied","Data":"6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524"} Sep 30 19:10:20 crc kubenswrapper[4747]: I0930 19:10:20.154064 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4clh" event={"ID":"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129","Type":"ContainerStarted","Data":"b0b396a6665fcc7547eabe0674c86cb28b0a14bd7256e47ea298065c4f1870ba"} Sep 30 19:10:22 crc kubenswrapper[4747]: I0930 19:10:22.179560 4747 generic.go:334] "Generic (PLEG): container finished" podID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerID="5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4" exitCode=0 Sep 30 19:10:22 crc kubenswrapper[4747]: I0930 19:10:22.179714 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4clh" event={"ID":"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129","Type":"ContainerDied","Data":"5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4"} Sep 30 19:10:23 crc kubenswrapper[4747]: I0930 19:10:23.193400 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4clh" 
event={"ID":"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129","Type":"ContainerStarted","Data":"a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981"} Sep 30 19:10:23 crc kubenswrapper[4747]: I0930 19:10:23.234321 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m4clh" podStartSLOduration=2.685863356 podStartE2EDuration="5.234289322s" podCreationTimestamp="2025-09-30 19:10:18 +0000 UTC" firstStartedPulling="2025-09-30 19:10:20.157667855 +0000 UTC m=+1459.817148009" lastFinishedPulling="2025-09-30 19:10:22.706093821 +0000 UTC m=+1462.365573975" observedRunningTime="2025-09-30 19:10:23.222067851 +0000 UTC m=+1462.881547985" watchObservedRunningTime="2025-09-30 19:10:23.234289322 +0000 UTC m=+1462.893769486" Sep 30 19:10:26 crc kubenswrapper[4747]: I0930 19:10:26.760465 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:26 crc kubenswrapper[4747]: I0930 19:10:26.844313 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:27 crc kubenswrapper[4747]: I0930 19:10:27.141593 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4r2jp"] Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.271791 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4r2jp" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="registry-server" containerID="cri-o://45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94" gracePeriod=2 Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.706850 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.835383 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv7nh\" (UniqueName: \"kubernetes.io/projected/d74509e7-d3a9-4229-ba8b-0e14349dcde8-kube-api-access-wv7nh\") pod \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.835473 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-catalog-content\") pod \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.835583 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-utilities\") pod \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\" (UID: \"d74509e7-d3a9-4229-ba8b-0e14349dcde8\") " Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.837787 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-utilities" (OuterVolumeSpecName: "utilities") pod "d74509e7-d3a9-4229-ba8b-0e14349dcde8" (UID: "d74509e7-d3a9-4229-ba8b-0e14349dcde8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.845196 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74509e7-d3a9-4229-ba8b-0e14349dcde8-kube-api-access-wv7nh" (OuterVolumeSpecName: "kube-api-access-wv7nh") pod "d74509e7-d3a9-4229-ba8b-0e14349dcde8" (UID: "d74509e7-d3a9-4229-ba8b-0e14349dcde8"). InnerVolumeSpecName "kube-api-access-wv7nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.932976 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d74509e7-d3a9-4229-ba8b-0e14349dcde8" (UID: "d74509e7-d3a9-4229-ba8b-0e14349dcde8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.939078 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.939161 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv7nh\" (UniqueName: \"kubernetes.io/projected/d74509e7-d3a9-4229-ba8b-0e14349dcde8-kube-api-access-wv7nh\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:28 crc kubenswrapper[4747]: I0930 19:10:28.939182 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d74509e7-d3a9-4229-ba8b-0e14349dcde8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.288318 4747 generic.go:334] "Generic (PLEG): container finished" podID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerID="45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94" exitCode=0 Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.288412 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4r2jp" event={"ID":"d74509e7-d3a9-4229-ba8b-0e14349dcde8","Type":"ContainerDied","Data":"45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94"} Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.288474 4747 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4r2jp" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.288509 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4r2jp" event={"ID":"d74509e7-d3a9-4229-ba8b-0e14349dcde8","Type":"ContainerDied","Data":"e9b3be218e8f071c29b5a6fcfcb0296257d8673c3e02588127b8bd7fd98f77b6"} Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.288548 4747 scope.go:117] "RemoveContainer" containerID="45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.310577 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.310666 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.329245 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4r2jp"] Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.336760 4747 scope.go:117] "RemoveContainer" containerID="fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.344399 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4r2jp"] Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.375214 4747 scope.go:117] "RemoveContainer" containerID="27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.401131 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.433341 4747 scope.go:117] "RemoveContainer" 
containerID="45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94" Sep 30 19:10:29 crc kubenswrapper[4747]: E0930 19:10:29.437435 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94\": container with ID starting with 45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94 not found: ID does not exist" containerID="45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.437689 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94"} err="failed to get container status \"45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94\": rpc error: code = NotFound desc = could not find container \"45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94\": container with ID starting with 45444a17a1cc8a1974cf728b13d6ea8af3cd1a39f9ad96aceadffe5869102c94 not found: ID does not exist" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.437851 4747 scope.go:117] "RemoveContainer" containerID="fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0" Sep 30 19:10:29 crc kubenswrapper[4747]: E0930 19:10:29.438693 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0\": container with ID starting with fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0 not found: ID does not exist" containerID="fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.438795 4747 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0"} err="failed to get container status \"fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0\": rpc error: code = NotFound desc = could not find container \"fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0\": container with ID starting with fdf7c707da0a00f5a69d7bacd947597f44a5a3200b4f64557359808e77ab2ae0 not found: ID does not exist" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.438840 4747 scope.go:117] "RemoveContainer" containerID="27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed" Sep 30 19:10:29 crc kubenswrapper[4747]: E0930 19:10:29.439700 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed\": container with ID starting with 27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed not found: ID does not exist" containerID="27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed" Sep 30 19:10:29 crc kubenswrapper[4747]: I0930 19:10:29.439901 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed"} err="failed to get container status \"27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed\": rpc error: code = NotFound desc = could not find container \"27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed\": container with ID starting with 27c0dc1cc45175868f9ef676fdd7e255080d11097baa363cbce612057fa617ed not found: ID does not exist" Sep 30 19:10:30 crc kubenswrapper[4747]: I0930 19:10:30.368670 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:31 crc kubenswrapper[4747]: I0930 19:10:31.107543 4747 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" path="/var/lib/kubelet/pods/d74509e7-d3a9-4229-ba8b-0e14349dcde8/volumes" Sep 30 19:10:31 crc kubenswrapper[4747]: I0930 19:10:31.537261 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4clh"] Sep 30 19:10:32 crc kubenswrapper[4747]: I0930 19:10:32.322974 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m4clh" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerName="registry-server" containerID="cri-o://a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981" gracePeriod=2 Sep 30 19:10:32 crc kubenswrapper[4747]: I0930 19:10:32.911025 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.024704 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-utilities\") pod \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.024831 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-catalog-content\") pod \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.024882 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-597tx\" (UniqueName: \"kubernetes.io/projected/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-kube-api-access-597tx\") pod \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\" (UID: \"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129\") " Sep 30 19:10:33 crc kubenswrapper[4747]: 
I0930 19:10:33.026369 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-utilities" (OuterVolumeSpecName: "utilities") pod "8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" (UID: "8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.044545 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-kube-api-access-597tx" (OuterVolumeSpecName: "kube-api-access-597tx") pod "8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" (UID: "8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129"). InnerVolumeSpecName "kube-api-access-597tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.051837 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" (UID: "8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.128075 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.128112 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.128127 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-597tx\" (UniqueName: \"kubernetes.io/projected/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129-kube-api-access-597tx\") on node \"crc\" DevicePath \"\"" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.341827 4747 generic.go:334] "Generic (PLEG): container finished" podID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerID="a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981" exitCode=0 Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.341997 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4clh" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.342009 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4clh" event={"ID":"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129","Type":"ContainerDied","Data":"a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981"} Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.342452 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4clh" event={"ID":"8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129","Type":"ContainerDied","Data":"b0b396a6665fcc7547eabe0674c86cb28b0a14bd7256e47ea298065c4f1870ba"} Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.342487 4747 scope.go:117] "RemoveContainer" containerID="a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.384267 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4clh"] Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.390374 4747 scope.go:117] "RemoveContainer" containerID="5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.399101 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4clh"] Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.425570 4747 scope.go:117] "RemoveContainer" containerID="6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.505499 4747 scope.go:117] "RemoveContainer" containerID="a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981" Sep 30 19:10:33 crc kubenswrapper[4747]: E0930 19:10:33.506092 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981\": container with ID starting with a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981 not found: ID does not exist" containerID="a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.506160 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981"} err="failed to get container status \"a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981\": rpc error: code = NotFound desc = could not find container \"a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981\": container with ID starting with a6b569b5f21b1ad7d2ecc128fe731f51844e351747789a3fae83e0efea1d5981 not found: ID does not exist" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.506200 4747 scope.go:117] "RemoveContainer" containerID="5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4" Sep 30 19:10:33 crc kubenswrapper[4747]: E0930 19:10:33.506592 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4\": container with ID starting with 5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4 not found: ID does not exist" containerID="5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.506645 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4"} err="failed to get container status \"5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4\": rpc error: code = NotFound desc = could not find container \"5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4\": container with ID 
starting with 5fdc64b3ae49f307034ab1f3171631d62566a7278ae958050796efddca4512b4 not found: ID does not exist" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.506679 4747 scope.go:117] "RemoveContainer" containerID="6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524" Sep 30 19:10:33 crc kubenswrapper[4747]: E0930 19:10:33.507027 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524\": container with ID starting with 6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524 not found: ID does not exist" containerID="6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524" Sep 30 19:10:33 crc kubenswrapper[4747]: I0930 19:10:33.507063 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524"} err="failed to get container status \"6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524\": rpc error: code = NotFound desc = could not find container \"6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524\": container with ID starting with 6a5a947b511cb1e73e941f61f3ace23a2b76caeef9893e6ff8b04a021ce6a524 not found: ID does not exist" Sep 30 19:10:35 crc kubenswrapper[4747]: I0930 19:10:35.107168 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" path="/var/lib/kubelet/pods/8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129/volumes" Sep 30 19:10:37 crc kubenswrapper[4747]: I0930 19:10:37.655868 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:10:37 crc kubenswrapper[4747]: I0930 
19:10:37.656520 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.054247 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6vtp"] Sep 30 19:11:04 crc kubenswrapper[4747]: E0930 19:11:04.055710 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerName="extract-content" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.055750 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerName="extract-content" Sep 30 19:11:04 crc kubenswrapper[4747]: E0930 19:11:04.055786 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="extract-utilities" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.055807 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="extract-utilities" Sep 30 19:11:04 crc kubenswrapper[4747]: E0930 19:11:04.055832 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerName="extract-utilities" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.055853 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerName="extract-utilities" Sep 30 19:11:04 crc kubenswrapper[4747]: E0930 19:11:04.055958 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="extract-content" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.055980 4747 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="extract-content" Sep 30 19:11:04 crc kubenswrapper[4747]: E0930 19:11:04.056021 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="registry-server" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.056043 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="registry-server" Sep 30 19:11:04 crc kubenswrapper[4747]: E0930 19:11:04.056078 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerName="registry-server" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.056096 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerName="registry-server" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.056493 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74509e7-d3a9-4229-ba8b-0e14349dcde8" containerName="registry-server" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.056538 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="8847c1e4-58ae-4dd8-9d4d-f2fcf5bb8129" containerName="registry-server" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.073301 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.080903 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6vtp"] Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.239955 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-utilities\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.240262 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-724fc\" (UniqueName: \"kubernetes.io/projected/cb42bb1a-2802-4841-84b5-d81d90a1e554-kube-api-access-724fc\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.240361 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-catalog-content\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.358937 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-utilities\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.359006 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-724fc\" (UniqueName: \"kubernetes.io/projected/cb42bb1a-2802-4841-84b5-d81d90a1e554-kube-api-access-724fc\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.359149 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-catalog-content\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.359558 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-utilities\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.359588 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-catalog-content\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.382887 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-724fc\" (UniqueName: \"kubernetes.io/projected/cb42bb1a-2802-4841-84b5-d81d90a1e554-kube-api-access-724fc\") pod \"community-operators-m6vtp\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.403344 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:04 crc kubenswrapper[4747]: I0930 19:11:04.892622 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6vtp"] Sep 30 19:11:04 crc kubenswrapper[4747]: W0930 19:11:04.903329 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb42bb1a_2802_4841_84b5_d81d90a1e554.slice/crio-f9ee877a481831f41e16e57a16f185c529a19b3d68403dcb4d7048fd39295286 WatchSource:0}: Error finding container f9ee877a481831f41e16e57a16f185c529a19b3d68403dcb4d7048fd39295286: Status 404 returned error can't find the container with id f9ee877a481831f41e16e57a16f185c529a19b3d68403dcb4d7048fd39295286 Sep 30 19:11:05 crc kubenswrapper[4747]: I0930 19:11:05.701204 4747 generic.go:334] "Generic (PLEG): container finished" podID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerID="9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1" exitCode=0 Sep 30 19:11:05 crc kubenswrapper[4747]: I0930 19:11:05.701367 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6vtp" event={"ID":"cb42bb1a-2802-4841-84b5-d81d90a1e554","Type":"ContainerDied","Data":"9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1"} Sep 30 19:11:05 crc kubenswrapper[4747]: I0930 19:11:05.701796 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6vtp" event={"ID":"cb42bb1a-2802-4841-84b5-d81d90a1e554","Type":"ContainerStarted","Data":"f9ee877a481831f41e16e57a16f185c529a19b3d68403dcb4d7048fd39295286"} Sep 30 19:11:07 crc kubenswrapper[4747]: I0930 19:11:07.656080 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:11:07 crc kubenswrapper[4747]: I0930 19:11:07.656883 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:11:07 crc kubenswrapper[4747]: I0930 19:11:07.656994 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 19:11:07 crc kubenswrapper[4747]: I0930 19:11:07.658098 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:11:07 crc kubenswrapper[4747]: I0930 19:11:07.658195 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" gracePeriod=600 Sep 30 19:11:07 crc kubenswrapper[4747]: I0930 19:11:07.730081 4747 generic.go:334] "Generic (PLEG): container finished" podID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerID="8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4" exitCode=0 Sep 30 19:11:07 crc kubenswrapper[4747]: I0930 19:11:07.730144 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6vtp" 
event={"ID":"cb42bb1a-2802-4841-84b5-d81d90a1e554","Type":"ContainerDied","Data":"8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4"} Sep 30 19:11:07 crc kubenswrapper[4747]: E0930 19:11:07.796506 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:11:08 crc kubenswrapper[4747]: I0930 19:11:08.336564 4747 scope.go:117] "RemoveContainer" containerID="a26df6585ef0623801339b8dcc91545c365b3eda853776c4f0c687c5fd7b10ec" Sep 30 19:11:08 crc kubenswrapper[4747]: I0930 19:11:08.746892 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6vtp" event={"ID":"cb42bb1a-2802-4841-84b5-d81d90a1e554","Type":"ContainerStarted","Data":"da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd"} Sep 30 19:11:08 crc kubenswrapper[4747]: I0930 19:11:08.751078 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" exitCode=0 Sep 30 19:11:08 crc kubenswrapper[4747]: I0930 19:11:08.751146 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a"} Sep 30 19:11:08 crc kubenswrapper[4747]: I0930 19:11:08.751199 4747 scope.go:117] "RemoveContainer" containerID="c119b3f4e3827265fb0e759c838f87c36311898c6328de90654d566acfc99097" Sep 30 19:11:08 crc kubenswrapper[4747]: I0930 19:11:08.752120 4747 scope.go:117] 
"RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:11:08 crc kubenswrapper[4747]: E0930 19:11:08.752698 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:11:08 crc kubenswrapper[4747]: I0930 19:11:08.777275 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m6vtp" podStartSLOduration=2.261876198 podStartE2EDuration="4.777251754s" podCreationTimestamp="2025-09-30 19:11:04 +0000 UTC" firstStartedPulling="2025-09-30 19:11:05.707216527 +0000 UTC m=+1505.366696671" lastFinishedPulling="2025-09-30 19:11:08.222592083 +0000 UTC m=+1507.882072227" observedRunningTime="2025-09-30 19:11:08.773000062 +0000 UTC m=+1508.432480266" watchObservedRunningTime="2025-09-30 19:11:08.777251754 +0000 UTC m=+1508.436731878" Sep 30 19:11:14 crc kubenswrapper[4747]: I0930 19:11:14.403902 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:14 crc kubenswrapper[4747]: I0930 19:11:14.405883 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:14 crc kubenswrapper[4747]: I0930 19:11:14.489563 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:14 crc kubenswrapper[4747]: I0930 19:11:14.919125 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6vtp" Sep 30 
19:11:14 crc kubenswrapper[4747]: I0930 19:11:14.994183 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6vtp"] Sep 30 19:11:16 crc kubenswrapper[4747]: I0930 19:11:16.871191 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m6vtp" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerName="registry-server" containerID="cri-o://da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd" gracePeriod=2 Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.285748 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.417027 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-catalog-content\") pod \"cb42bb1a-2802-4841-84b5-d81d90a1e554\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.417568 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-724fc\" (UniqueName: \"kubernetes.io/projected/cb42bb1a-2802-4841-84b5-d81d90a1e554-kube-api-access-724fc\") pod \"cb42bb1a-2802-4841-84b5-d81d90a1e554\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.417664 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-utilities\") pod \"cb42bb1a-2802-4841-84b5-d81d90a1e554\" (UID: \"cb42bb1a-2802-4841-84b5-d81d90a1e554\") " Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.420505 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-utilities" (OuterVolumeSpecName: "utilities") pod "cb42bb1a-2802-4841-84b5-d81d90a1e554" (UID: "cb42bb1a-2802-4841-84b5-d81d90a1e554"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.428805 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb42bb1a-2802-4841-84b5-d81d90a1e554-kube-api-access-724fc" (OuterVolumeSpecName: "kube-api-access-724fc") pod "cb42bb1a-2802-4841-84b5-d81d90a1e554" (UID: "cb42bb1a-2802-4841-84b5-d81d90a1e554"). InnerVolumeSpecName "kube-api-access-724fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.520343 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-724fc\" (UniqueName: \"kubernetes.io/projected/cb42bb1a-2802-4841-84b5-d81d90a1e554-kube-api-access-724fc\") on node \"crc\" DevicePath \"\"" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.520389 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.698330 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb42bb1a-2802-4841-84b5-d81d90a1e554" (UID: "cb42bb1a-2802-4841-84b5-d81d90a1e554"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.727336 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb42bb1a-2802-4841-84b5-d81d90a1e554-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.885712 4747 generic.go:334] "Generic (PLEG): container finished" podID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerID="da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd" exitCode=0 Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.885761 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6vtp" event={"ID":"cb42bb1a-2802-4841-84b5-d81d90a1e554","Type":"ContainerDied","Data":"da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd"} Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.885795 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6vtp" event={"ID":"cb42bb1a-2802-4841-84b5-d81d90a1e554","Type":"ContainerDied","Data":"f9ee877a481831f41e16e57a16f185c529a19b3d68403dcb4d7048fd39295286"} Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.885816 4747 scope.go:117] "RemoveContainer" containerID="da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.885824 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6vtp" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.926096 4747 scope.go:117] "RemoveContainer" containerID="8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4" Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.941191 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6vtp"] Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.954523 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m6vtp"] Sep 30 19:11:17 crc kubenswrapper[4747]: I0930 19:11:17.964282 4747 scope.go:117] "RemoveContainer" containerID="9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1" Sep 30 19:11:18 crc kubenswrapper[4747]: I0930 19:11:18.024149 4747 scope.go:117] "RemoveContainer" containerID="da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd" Sep 30 19:11:18 crc kubenswrapper[4747]: E0930 19:11:18.025263 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd\": container with ID starting with da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd not found: ID does not exist" containerID="da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd" Sep 30 19:11:18 crc kubenswrapper[4747]: I0930 19:11:18.025326 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd"} err="failed to get container status \"da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd\": rpc error: code = NotFound desc = could not find container \"da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd\": container with ID starting with da1699b25fe6751dae3e7137391892f1063621d7f8b4340d4d3ffbb3d96359bd not 
found: ID does not exist" Sep 30 19:11:18 crc kubenswrapper[4747]: I0930 19:11:18.025368 4747 scope.go:117] "RemoveContainer" containerID="8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4" Sep 30 19:11:18 crc kubenswrapper[4747]: E0930 19:11:18.026011 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4\": container with ID starting with 8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4 not found: ID does not exist" containerID="8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4" Sep 30 19:11:18 crc kubenswrapper[4747]: I0930 19:11:18.026061 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4"} err="failed to get container status \"8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4\": rpc error: code = NotFound desc = could not find container \"8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4\": container with ID starting with 8e0ab1aa0ab16b5d25d0e39cf93650dd879d176d83934e8bb16b5a076dc959a4 not found: ID does not exist" Sep 30 19:11:18 crc kubenswrapper[4747]: I0930 19:11:18.026092 4747 scope.go:117] "RemoveContainer" containerID="9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1" Sep 30 19:11:18 crc kubenswrapper[4747]: E0930 19:11:18.026462 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1\": container with ID starting with 9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1 not found: ID does not exist" containerID="9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1" Sep 30 19:11:18 crc kubenswrapper[4747]: I0930 19:11:18.026513 4747 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1"} err="failed to get container status \"9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1\": rpc error: code = NotFound desc = could not find container \"9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1\": container with ID starting with 9f94f3513e519be156db1129dae9455b3c414523a8798e281b9c3546dca299f1 not found: ID does not exist" Sep 30 19:11:19 crc kubenswrapper[4747]: I0930 19:11:19.088514 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:11:19 crc kubenswrapper[4747]: E0930 19:11:19.089078 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:11:19 crc kubenswrapper[4747]: I0930 19:11:19.107323 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" path="/var/lib/kubelet/pods/cb42bb1a-2802-4841-84b5-d81d90a1e554/volumes" Sep 30 19:11:31 crc kubenswrapper[4747]: I0930 19:11:31.098993 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:11:31 crc kubenswrapper[4747]: E0930 19:11:31.100095 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:11:43 crc kubenswrapper[4747]: I0930 19:11:43.088918 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:11:43 crc kubenswrapper[4747]: E0930 19:11:43.090150 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:11:55 crc kubenswrapper[4747]: I0930 19:11:55.087609 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:11:55 crc kubenswrapper[4747]: E0930 19:11:55.088515 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:12:08 crc kubenswrapper[4747]: I0930 19:12:08.087436 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:12:08 crc kubenswrapper[4747]: E0930 19:12:08.088720 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:12:23 crc kubenswrapper[4747]: I0930 19:12:23.088066 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:12:23 crc kubenswrapper[4747]: E0930 19:12:23.089451 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:12:36 crc kubenswrapper[4747]: I0930 19:12:36.087816 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:12:36 crc kubenswrapper[4747]: E0930 19:12:36.088485 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:12:48 crc kubenswrapper[4747]: I0930 19:12:48.087760 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:12:48 crc kubenswrapper[4747]: E0930 19:12:48.088591 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:13:02 crc kubenswrapper[4747]: I0930 19:13:02.086519 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:13:02 crc kubenswrapper[4747]: E0930 19:13:02.087498 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:13:08 crc kubenswrapper[4747]: I0930 19:13:08.521820 4747 scope.go:117] "RemoveContainer" containerID="fe38fd93c43851379580425b30ed9e4910031c92756ba0b704ebaf573ed1c8aa" Sep 30 19:13:13 crc kubenswrapper[4747]: I0930 19:13:13.087648 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:13:13 crc kubenswrapper[4747]: E0930 19:13:13.088786 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:13:26 crc kubenswrapper[4747]: I0930 19:13:26.087395 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:13:26 crc 
kubenswrapper[4747]: E0930 19:13:26.088502 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:13:40 crc kubenswrapper[4747]: I0930 19:13:40.087618 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:13:40 crc kubenswrapper[4747]: E0930 19:13:40.088604 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:13:52 crc kubenswrapper[4747]: I0930 19:13:52.087960 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:13:52 crc kubenswrapper[4747]: E0930 19:13:52.089167 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:13:55 crc kubenswrapper[4747]: I0930 19:13:55.064171 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-s8xfv"] Sep 30 19:13:55 crc 
kubenswrapper[4747]: I0930 19:13:55.080863 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-s8xfv"] Sep 30 19:13:55 crc kubenswrapper[4747]: I0930 19:13:55.113103 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41949c60-b7d7-4810-8c5e-67a739ec4f17" path="/var/lib/kubelet/pods/41949c60-b7d7-4810-8c5e-67a739ec4f17/volumes" Sep 30 19:13:59 crc kubenswrapper[4747]: I0930 19:13:59.043555 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ndvkc"] Sep 30 19:13:59 crc kubenswrapper[4747]: I0930 19:13:59.061771 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9cn8j"] Sep 30 19:13:59 crc kubenswrapper[4747]: I0930 19:13:59.071209 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9cn8j"] Sep 30 19:13:59 crc kubenswrapper[4747]: I0930 19:13:59.079569 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ndvkc"] Sep 30 19:13:59 crc kubenswrapper[4747]: I0930 19:13:59.103051 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab203f58-7cb7-497d-9a9b-201f39531e63" path="/var/lib/kubelet/pods/ab203f58-7cb7-497d-9a9b-201f39531e63/volumes" Sep 30 19:13:59 crc kubenswrapper[4747]: I0930 19:13:59.103871 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3eb9234-0905-4ab0-ae40-5829d01890aa" path="/var/lib/kubelet/pods/e3eb9234-0905-4ab0-ae40-5829d01890aa/volumes" Sep 30 19:14:03 crc kubenswrapper[4747]: I0930 19:14:03.087590 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:14:03 crc kubenswrapper[4747]: E0930 19:14:03.088449 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:14:05 crc kubenswrapper[4747]: I0930 19:14:05.047549 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-96e4-account-create-qbklj"] Sep 30 19:14:05 crc kubenswrapper[4747]: I0930 19:14:05.064328 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-96e4-account-create-qbklj"] Sep 30 19:14:05 crc kubenswrapper[4747]: I0930 19:14:05.103477 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7f8831-1689-4327-a4ce-ff770e5840ec" path="/var/lib/kubelet/pods/7c7f8831-1689-4327-a4ce-ff770e5840ec/volumes" Sep 30 19:14:08 crc kubenswrapper[4747]: I0930 19:14:08.604234 4747 scope.go:117] "RemoveContainer" containerID="987dccbdaae285fe943b3af90192e9ef8de35690eadcbdfce774c27d5b9c25b7" Sep 30 19:14:08 crc kubenswrapper[4747]: I0930 19:14:08.638251 4747 scope.go:117] "RemoveContainer" containerID="0a42e01b990fa64d8599727fb18de6d3705d4a3499aedd36c074216640c98176" Sep 30 19:14:08 crc kubenswrapper[4747]: I0930 19:14:08.694134 4747 scope.go:117] "RemoveContainer" containerID="29c6d379cc86e0b0977c216ea3552b824ae6bfbd0fa4d87b89a756788d7a4fa6" Sep 30 19:14:08 crc kubenswrapper[4747]: I0930 19:14:08.744569 4747 scope.go:117] "RemoveContainer" containerID="5c3efdef7c3c30deb2ac09b34ed58a1a6b899b49e34d3078d56ec0199261dae5" Sep 30 19:14:09 crc kubenswrapper[4747]: I0930 19:14:09.050288 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a085-account-create-t62dz"] Sep 30 19:14:09 crc kubenswrapper[4747]: I0930 19:14:09.064261 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2a30-account-create-jbfsq"] Sep 30 19:14:09 crc kubenswrapper[4747]: I0930 19:14:09.074850 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-a085-account-create-t62dz"] Sep 30 19:14:09 crc kubenswrapper[4747]: I0930 19:14:09.081850 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2a30-account-create-jbfsq"] Sep 30 19:14:09 crc kubenswrapper[4747]: I0930 19:14:09.106920 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d0e13c0-006b-4a57-853f-2e0cc942319a" path="/var/lib/kubelet/pods/1d0e13c0-006b-4a57-853f-2e0cc942319a/volumes" Sep 30 19:14:09 crc kubenswrapper[4747]: I0930 19:14:09.108302 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf39398-6e15-4170-9caa-b4a73cca0a46" path="/var/lib/kubelet/pods/2cf39398-6e15-4170-9caa-b4a73cca0a46/volumes" Sep 30 19:14:18 crc kubenswrapper[4747]: I0930 19:14:18.088008 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:14:18 crc kubenswrapper[4747]: E0930 19:14:18.088834 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:14:29 crc kubenswrapper[4747]: I0930 19:14:29.079555 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xnvtm"] Sep 30 19:14:29 crc kubenswrapper[4747]: I0930 19:14:29.087127 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:14:29 crc kubenswrapper[4747]: E0930 19:14:29.087513 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:14:29 crc kubenswrapper[4747]: I0930 19:14:29.107730 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-svxbf"] Sep 30 19:14:29 crc kubenswrapper[4747]: I0930 19:14:29.107780 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-bhzpm"] Sep 30 19:14:29 crc kubenswrapper[4747]: I0930 19:14:29.111807 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-bhzpm"] Sep 30 19:14:29 crc kubenswrapper[4747]: I0930 19:14:29.125321 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-svxbf"] Sep 30 19:14:29 crc kubenswrapper[4747]: I0930 19:14:29.145797 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xnvtm"] Sep 30 19:14:31 crc kubenswrapper[4747]: I0930 19:14:31.101168 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f3e5f1-0b03-499d-aa2e-1efa9d8c2154" path="/var/lib/kubelet/pods/10f3e5f1-0b03-499d-aa2e-1efa9d8c2154/volumes" Sep 30 19:14:31 crc kubenswrapper[4747]: I0930 19:14:31.102913 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="791723cb-ce23-48fc-b941-ae01909dc4a4" path="/var/lib/kubelet/pods/791723cb-ce23-48fc-b941-ae01909dc4a4/volumes" Sep 30 19:14:31 crc kubenswrapper[4747]: I0930 19:14:31.103785 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc3a0c2-38a2-41a1-90d1-992ccba7d54f" path="/var/lib/kubelet/pods/bdc3a0c2-38a2-41a1-90d1-992ccba7d54f/volumes" Sep 30 19:14:35 crc kubenswrapper[4747]: I0930 19:14:35.050379 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-w6p75"] Sep 30 19:14:35 crc kubenswrapper[4747]: I0930 
19:14:35.063190 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-w6p75"] Sep 30 19:14:35 crc kubenswrapper[4747]: I0930 19:14:35.101358 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b34fa2-4ead-4998-a29d-a29d16dc9aea" path="/var/lib/kubelet/pods/19b34fa2-4ead-4998-a29d-a29d16dc9aea/volumes" Sep 30 19:14:42 crc kubenswrapper[4747]: I0930 19:14:42.087585 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:14:42 crc kubenswrapper[4747]: E0930 19:14:42.088504 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:14:47 crc kubenswrapper[4747]: I0930 19:14:47.047226 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-665b-account-create-rsr8w"] Sep 30 19:14:47 crc kubenswrapper[4747]: I0930 19:14:47.061024 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-665b-account-create-rsr8w"] Sep 30 19:14:47 crc kubenswrapper[4747]: I0930 19:14:47.106670 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd91c92e-badb-4682-b203-7889aeeef868" path="/var/lib/kubelet/pods/fd91c92e-badb-4682-b203-7889aeeef868/volumes" Sep 30 19:14:48 crc kubenswrapper[4747]: I0930 19:14:48.029400 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d52e-account-create-mb42m"] Sep 30 19:14:48 crc kubenswrapper[4747]: I0930 19:14:48.038779 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d52e-account-create-mb42m"] Sep 30 19:14:49 crc kubenswrapper[4747]: 
I0930 19:14:49.071297 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-h5f4l"] Sep 30 19:14:49 crc kubenswrapper[4747]: I0930 19:14:49.085123 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-h5f4l"] Sep 30 19:14:49 crc kubenswrapper[4747]: I0930 19:14:49.107509 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d86156e-2bee-4e6e-a152-5f2b92d0ed6e" path="/var/lib/kubelet/pods/2d86156e-2bee-4e6e-a152-5f2b92d0ed6e/volumes" Sep 30 19:14:49 crc kubenswrapper[4747]: I0930 19:14:49.108892 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fcd68f8-070b-4361-841f-acea0b80118f" path="/var/lib/kubelet/pods/2fcd68f8-070b-4361-841f-acea0b80118f/volumes" Sep 30 19:14:51 crc kubenswrapper[4747]: I0930 19:14:51.041031 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t55gv"] Sep 30 19:14:51 crc kubenswrapper[4747]: I0930 19:14:51.054106 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-t55gv"] Sep 30 19:14:51 crc kubenswrapper[4747]: I0930 19:14:51.102045 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="922b2387-c911-4696-aa99-9dd3e51640f8" path="/var/lib/kubelet/pods/922b2387-c911-4696-aa99-9dd3e51640f8/volumes" Sep 30 19:14:57 crc kubenswrapper[4747]: I0930 19:14:57.088007 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:14:57 crc kubenswrapper[4747]: E0930 19:14:57.089080 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" 
podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.170511 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls"] Sep 30 19:15:00 crc kubenswrapper[4747]: E0930 19:15:00.171077 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerName="registry-server" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.171098 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerName="registry-server" Sep 30 19:15:00 crc kubenswrapper[4747]: E0930 19:15:00.171132 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerName="extract-content" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.171144 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerName="extract-content" Sep 30 19:15:00 crc kubenswrapper[4747]: E0930 19:15:00.171182 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerName="extract-utilities" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.171195 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerName="extract-utilities" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.171513 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb42bb1a-2802-4841-84b5-d81d90a1e554" containerName="registry-server" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.172421 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.182323 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.182635 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.189069 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls"] Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.301488 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-secret-volume\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.301803 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqbd\" (UniqueName: \"kubernetes.io/projected/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-kube-api-access-wrqbd\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.301914 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-config-volume\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.403882 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqbd\" (UniqueName: \"kubernetes.io/projected/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-kube-api-access-wrqbd\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.404005 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-config-volume\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.404033 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-secret-volume\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.405315 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-config-volume\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.413393 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-secret-volume\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.430701 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqbd\" (UniqueName: \"kubernetes.io/projected/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-kube-api-access-wrqbd\") pod \"collect-profiles-29320995-qpzls\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.507816 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:00 crc kubenswrapper[4747]: I0930 19:15:00.820743 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls"] Sep 30 19:15:00 crc kubenswrapper[4747]: W0930 19:15:00.828284 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5860d576_4d4f_49aa_b5ab_778f0ae0e6ec.slice/crio-91ce2d43d8568beaac74d01302503a4a3ed32bbc23e4009858a2cb4a30315c53 WatchSource:0}: Error finding container 91ce2d43d8568beaac74d01302503a4a3ed32bbc23e4009858a2cb4a30315c53: Status 404 returned error can't find the container with id 91ce2d43d8568beaac74d01302503a4a3ed32bbc23e4009858a2cb4a30315c53 Sep 30 19:15:01 crc kubenswrapper[4747]: I0930 19:15:01.474852 4747 generic.go:334] "Generic (PLEG): container finished" podID="5860d576-4d4f-49aa-b5ab-778f0ae0e6ec" containerID="b64ef81a83dfb7e52d1a888147d6afd1bdd3a493876db2c74fa4ea877c7a3d1a" exitCode=0 Sep 30 19:15:01 crc kubenswrapper[4747]: I0930 19:15:01.475020 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" event={"ID":"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec","Type":"ContainerDied","Data":"b64ef81a83dfb7e52d1a888147d6afd1bdd3a493876db2c74fa4ea877c7a3d1a"} Sep 30 19:15:01 crc kubenswrapper[4747]: I0930 19:15:01.475265 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" event={"ID":"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec","Type":"ContainerStarted","Data":"91ce2d43d8568beaac74d01302503a4a3ed32bbc23e4009858a2cb4a30315c53"} Sep 30 19:15:02 crc kubenswrapper[4747]: I0930 19:15:02.905647 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.055103 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-secret-volume\") pod \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.055403 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrqbd\" (UniqueName: \"kubernetes.io/projected/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-kube-api-access-wrqbd\") pod \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.055465 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-config-volume\") pod \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\" (UID: \"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec\") " Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.056988 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-config-volume" (OuterVolumeSpecName: "config-volume") pod "5860d576-4d4f-49aa-b5ab-778f0ae0e6ec" (UID: "5860d576-4d4f-49aa-b5ab-778f0ae0e6ec"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.064373 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-kube-api-access-wrqbd" (OuterVolumeSpecName: "kube-api-access-wrqbd") pod "5860d576-4d4f-49aa-b5ab-778f0ae0e6ec" (UID: "5860d576-4d4f-49aa-b5ab-778f0ae0e6ec"). InnerVolumeSpecName "kube-api-access-wrqbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.065066 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5860d576-4d4f-49aa-b5ab-778f0ae0e6ec" (UID: "5860d576-4d4f-49aa-b5ab-778f0ae0e6ec"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.157633 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrqbd\" (UniqueName: \"kubernetes.io/projected/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-kube-api-access-wrqbd\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.157674 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.157687 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5860d576-4d4f-49aa-b5ab-778f0ae0e6ec-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.511224 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" event={"ID":"5860d576-4d4f-49aa-b5ab-778f0ae0e6ec","Type":"ContainerDied","Data":"91ce2d43d8568beaac74d01302503a4a3ed32bbc23e4009858a2cb4a30315c53"} Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.511284 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ce2d43d8568beaac74d01302503a4a3ed32bbc23e4009858a2cb4a30315c53" Sep 30 19:15:03 crc kubenswrapper[4747]: I0930 19:15:03.511315 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320995-qpzls" Sep 30 19:15:08 crc kubenswrapper[4747]: I0930 19:15:08.087786 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:15:08 crc kubenswrapper[4747]: E0930 19:15:08.090453 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:15:08 crc kubenswrapper[4747]: I0930 19:15:08.864403 4747 scope.go:117] "RemoveContainer" containerID="62a1c03b04ebeca827ef9b623107b09329c40d008cc6c30a0f7baf1ecc74b7fc" Sep 30 19:15:08 crc kubenswrapper[4747]: I0930 19:15:08.929025 4747 scope.go:117] "RemoveContainer" containerID="9e1be8fff257effce4ab7759cca64b5ade8b8616626e59d082b777ec812c7847" Sep 30 19:15:08 crc kubenswrapper[4747]: I0930 19:15:08.971548 4747 scope.go:117] "RemoveContainer" containerID="9b5db20853331d96d46333e42566488a9f8d438ab6332f0b4743e7ea153f7c03" Sep 30 19:15:09 crc kubenswrapper[4747]: I0930 19:15:09.034446 4747 scope.go:117] "RemoveContainer" containerID="44e389b786a338dc68ff82df392cf6560c31ee2c8bc2946f421aca101f5b29ce" Sep 30 19:15:09 crc kubenswrapper[4747]: I0930 19:15:09.080126 4747 scope.go:117] "RemoveContainer" containerID="f94116fc3f08a21c17ccaad0950a285bf4b9d0c575cb327eeb29c2d4fda74417" Sep 30 19:15:09 crc kubenswrapper[4747]: I0930 19:15:09.116601 4747 scope.go:117] "RemoveContainer" containerID="58b13995015ed034b97597e203150eb92685a2311a843c8262ce841153e6ef65" Sep 30 19:15:09 crc kubenswrapper[4747]: I0930 19:15:09.141510 4747 scope.go:117] "RemoveContainer" 
containerID="21eab6f41fbe85fcf87a21d10eed0608c9dfc6ee065b6b32e058b30e3c4add5e" Sep 30 19:15:09 crc kubenswrapper[4747]: I0930 19:15:09.162515 4747 scope.go:117] "RemoveContainer" containerID="05f707f4bf554b46117c9ba81bc97f84f57dbf0bc279af3831d324ad28849bdf" Sep 30 19:15:09 crc kubenswrapper[4747]: I0930 19:15:09.180406 4747 scope.go:117] "RemoveContainer" containerID="624e3b52353e8b7d95c206eb9c83eac38ee7ad1e796141ad957a0ffea31cd384" Sep 30 19:15:09 crc kubenswrapper[4747]: I0930 19:15:09.205385 4747 scope.go:117] "RemoveContainer" containerID="53ef9f7f2354a088199bd3d70ce65e283dd3d75aa068387ff9c07d8c2bea465d" Sep 30 19:15:10 crc kubenswrapper[4747]: I0930 19:15:10.050022 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vpn6w"] Sep 30 19:15:10 crc kubenswrapper[4747]: I0930 19:15:10.068564 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vpn6w"] Sep 30 19:15:11 crc kubenswrapper[4747]: I0930 19:15:11.110778 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e42307-0f32-4155-bfa3-57d4de734daa" path="/var/lib/kubelet/pods/69e42307-0f32-4155-bfa3-57d4de734daa/volumes" Sep 30 19:15:14 crc kubenswrapper[4747]: I0930 19:15:14.046577 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-kk26j"] Sep 30 19:15:14 crc kubenswrapper[4747]: I0930 19:15:14.067975 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-kk26j"] Sep 30 19:15:15 crc kubenswrapper[4747]: I0930 19:15:15.098715 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbc3a67-c4e2-4f47-9ed0-bcefdc423407" path="/var/lib/kubelet/pods/8fbc3a67-c4e2-4f47-9ed0-bcefdc423407/volumes" Sep 30 19:15:23 crc kubenswrapper[4747]: I0930 19:15:23.088472 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:15:23 crc kubenswrapper[4747]: E0930 19:15:23.089733 4747 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:15:36 crc kubenswrapper[4747]: I0930 19:15:36.087737 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:15:36 crc kubenswrapper[4747]: E0930 19:15:36.088804 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:15:40 crc kubenswrapper[4747]: I0930 19:15:40.073462 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ssjx4"] Sep 30 19:15:40 crc kubenswrapper[4747]: I0930 19:15:40.087730 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-8lhgs"] Sep 30 19:15:40 crc kubenswrapper[4747]: I0930 19:15:40.097671 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-8lhgs"] Sep 30 19:15:40 crc kubenswrapper[4747]: I0930 19:15:40.105424 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pv4ms"] Sep 30 19:15:40 crc kubenswrapper[4747]: I0930 19:15:40.112826 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ssjx4"] Sep 30 19:15:40 crc kubenswrapper[4747]: I0930 19:15:40.119060 4747 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pv4ms"] Sep 30 19:15:41 crc kubenswrapper[4747]: I0930 19:15:41.108954 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a20f42-1c6e-40b6-b893-c77dd9f79b01" path="/var/lib/kubelet/pods/08a20f42-1c6e-40b6-b893-c77dd9f79b01/volumes" Sep 30 19:15:41 crc kubenswrapper[4747]: I0930 19:15:41.110810 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf" path="/var/lib/kubelet/pods/2b022cfe-5c7b-401f-bc2e-6bee89cd5cdf/volumes" Sep 30 19:15:41 crc kubenswrapper[4747]: I0930 19:15:41.111396 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10981c2-a487-449d-85d6-d6cdf33815b5" path="/var/lib/kubelet/pods/f10981c2-a487-449d-85d6-d6cdf33815b5/volumes" Sep 30 19:15:46 crc kubenswrapper[4747]: I0930 19:15:46.026664 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-acf1-account-create-bq4hx"] Sep 30 19:15:46 crc kubenswrapper[4747]: I0930 19:15:46.036563 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-acf1-account-create-bq4hx"] Sep 30 19:15:47 crc kubenswrapper[4747]: I0930 19:15:47.046867 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b850-account-create-r6fql"] Sep 30 19:15:47 crc kubenswrapper[4747]: I0930 19:15:47.063236 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5920-account-create-8htm8"] Sep 30 19:15:47 crc kubenswrapper[4747]: I0930 19:15:47.076104 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5920-account-create-8htm8"] Sep 30 19:15:47 crc kubenswrapper[4747]: I0930 19:15:47.087413 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:15:47 crc kubenswrapper[4747]: E0930 19:15:47.087850 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:15:47 crc kubenswrapper[4747]: I0930 19:15:47.109698 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be78d88-d62c-4ac0-a07c-0e3e323232c0" path="/var/lib/kubelet/pods/4be78d88-d62c-4ac0-a07c-0e3e323232c0/volumes" Sep 30 19:15:47 crc kubenswrapper[4747]: I0930 19:15:47.110443 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f7e6bc-f06d-475a-8cb9-778ad0105c07" path="/var/lib/kubelet/pods/69f7e6bc-f06d-475a-8cb9-778ad0105c07/volumes" Sep 30 19:15:47 crc kubenswrapper[4747]: I0930 19:15:47.111202 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b850-account-create-r6fql"] Sep 30 19:15:49 crc kubenswrapper[4747]: I0930 19:15:49.107612 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8273640-d8ff-4820-8762-fe35091c22ff" path="/var/lib/kubelet/pods/b8273640-d8ff-4820-8762-fe35091c22ff/volumes" Sep 30 19:15:59 crc kubenswrapper[4747]: I0930 19:15:59.088444 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:15:59 crc kubenswrapper[4747]: E0930 19:15:59.089548 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:16:06 crc kubenswrapper[4747]: I0930 
19:16:06.076532 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ppfv"] Sep 30 19:16:06 crc kubenswrapper[4747]: I0930 19:16:06.085456 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ppfv"] Sep 30 19:16:07 crc kubenswrapper[4747]: I0930 19:16:07.096754 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39451899-d6f8-4b5a-aa40-5e72ee92d8d7" path="/var/lib/kubelet/pods/39451899-d6f8-4b5a-aa40-5e72ee92d8d7/volumes" Sep 30 19:16:09 crc kubenswrapper[4747]: I0930 19:16:09.390427 4747 scope.go:117] "RemoveContainer" containerID="d6c7ecf929760a6ad906bde3a6b844b08e02434bb6f044d8e4abfe175a0d4991" Sep 30 19:16:09 crc kubenswrapper[4747]: I0930 19:16:09.454619 4747 scope.go:117] "RemoveContainer" containerID="47931ef5852e5d81bcdac51304cc3f06675c644c43b07d4b41032fa55314edcb" Sep 30 19:16:09 crc kubenswrapper[4747]: I0930 19:16:09.498581 4747 scope.go:117] "RemoveContainer" containerID="7da630a819c95600ab6a65d26a5f43bb0e2c342cce2fe5d3d9907821645c2ccb" Sep 30 19:16:09 crc kubenswrapper[4747]: I0930 19:16:09.549727 4747 scope.go:117] "RemoveContainer" containerID="665e4d8c3052a81434fd0b951eec2e45c3587a176b6815a8ea0e7af1db7fca63" Sep 30 19:16:09 crc kubenswrapper[4747]: I0930 19:16:09.571876 4747 scope.go:117] "RemoveContainer" containerID="8782fd1224eaa557958a4b9695e34272e34c2388c1c5158e5889508200fcab14" Sep 30 19:16:09 crc kubenswrapper[4747]: I0930 19:16:09.608051 4747 scope.go:117] "RemoveContainer" containerID="41cbfdef25dcd41346009cf5b18826b89241b94a58e75eefa0cb73a22366c021" Sep 30 19:16:09 crc kubenswrapper[4747]: I0930 19:16:09.637692 4747 scope.go:117] "RemoveContainer" containerID="4402083650e1f6bdbf9d9ea20c7e1e94ee24e2e60621d16fdc7db4b23c094785" Sep 30 19:16:09 crc kubenswrapper[4747]: I0930 19:16:09.659162 4747 scope.go:117] "RemoveContainer" containerID="4ce6f38185f13f9a9245656a867548db218454ea98e6734786ffe5229d82fe3b" Sep 30 19:16:09 crc 
kubenswrapper[4747]: I0930 19:16:09.685385 4747 scope.go:117] "RemoveContainer" containerID="ca8b6a89f13a5d2a6df2339f3a8a096caf7e891371d6692b121f0fd805176a77" Sep 30 19:16:12 crc kubenswrapper[4747]: I0930 19:16:12.088215 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:16:13 crc kubenswrapper[4747]: I0930 19:16:13.297260 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"3c1a5be81c807961582235eda9397736cd488865a6554919d8f86431797ba70e"} Sep 30 19:16:24 crc kubenswrapper[4747]: I0930 19:16:24.067280 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vwjq7"] Sep 30 19:16:24 crc kubenswrapper[4747]: I0930 19:16:24.074828 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vwjq7"] Sep 30 19:16:25 crc kubenswrapper[4747]: I0930 19:16:25.044752 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-krqdj"] Sep 30 19:16:25 crc kubenswrapper[4747]: I0930 19:16:25.058271 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-krqdj"] Sep 30 19:16:25 crc kubenswrapper[4747]: I0930 19:16:25.103552 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aabdf08-0969-4173-ba76-9e55ba35150a" path="/var/lib/kubelet/pods/3aabdf08-0969-4173-ba76-9e55ba35150a/volumes" Sep 30 19:16:25 crc kubenswrapper[4747]: I0930 19:16:25.104297 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ca5396-cd5d-4c87-930f-0891e66a5613" path="/var/lib/kubelet/pods/85ca5396-cd5d-4c87-930f-0891e66a5613/volumes" Sep 30 19:17:08 crc kubenswrapper[4747]: I0930 19:17:08.067006 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-7nkhl"] Sep 30 19:17:08 crc kubenswrapper[4747]: I0930 19:17:08.084031 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7nkhl"] Sep 30 19:17:09 crc kubenswrapper[4747]: I0930 19:17:09.099758 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3cf400-e562-4174-ac4a-7deb1d8d0be5" path="/var/lib/kubelet/pods/9b3cf400-e562-4174-ac4a-7deb1d8d0be5/volumes" Sep 30 19:17:09 crc kubenswrapper[4747]: I0930 19:17:09.842393 4747 scope.go:117] "RemoveContainer" containerID="15f0a5e64f9e3c353b72d8e5d70bcf93ca02dc298bb54cf3639c6a8d482bcc0b" Sep 30 19:17:09 crc kubenswrapper[4747]: I0930 19:17:09.911345 4747 scope.go:117] "RemoveContainer" containerID="5f098f8104e59d3a9a4d6f5f6f08a5607f23681c771af810c45716cd2a121dca" Sep 30 19:17:09 crc kubenswrapper[4747]: I0930 19:17:09.970222 4747 scope.go:117] "RemoveContainer" containerID="5ce8c55fe8c8c4997a73a025688c8da411856bdb754446af35f9fe2f1f23976d" Sep 30 19:18:37 crc kubenswrapper[4747]: I0930 19:18:37.655788 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:18:37 crc kubenswrapper[4747]: I0930 19:18:37.656257 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:19:07 crc kubenswrapper[4747]: I0930 19:19:07.655849 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:19:07 crc kubenswrapper[4747]: I0930 19:19:07.656571 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:19:37 crc kubenswrapper[4747]: I0930 19:19:37.655840 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:19:37 crc kubenswrapper[4747]: I0930 19:19:37.657348 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:19:37 crc kubenswrapper[4747]: I0930 19:19:37.657443 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 19:19:37 crc kubenswrapper[4747]: I0930 19:19:37.658434 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c1a5be81c807961582235eda9397736cd488865a6554919d8f86431797ba70e"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:19:37 crc kubenswrapper[4747]: I0930 19:19:37.658537 4747 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://3c1a5be81c807961582235eda9397736cd488865a6554919d8f86431797ba70e" gracePeriod=600 Sep 30 19:19:38 crc kubenswrapper[4747]: I0930 19:19:38.563275 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="3c1a5be81c807961582235eda9397736cd488865a6554919d8f86431797ba70e" exitCode=0 Sep 30 19:19:38 crc kubenswrapper[4747]: I0930 19:19:38.563314 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"3c1a5be81c807961582235eda9397736cd488865a6554919d8f86431797ba70e"} Sep 30 19:19:38 crc kubenswrapper[4747]: I0930 19:19:38.564493 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468"} Sep 30 19:19:38 crc kubenswrapper[4747]: I0930 19:19:38.564572 4747 scope.go:117] "RemoveContainer" containerID="86cf320c83bf78afb43f1a0eec99d15e1febec0ed4f329f1549c6d012a4ffa8a" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.016596 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sjmw8"] Sep 30 19:20:28 crc kubenswrapper[4747]: E0930 19:20:28.018412 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5860d576-4d4f-49aa-b5ab-778f0ae0e6ec" containerName="collect-profiles" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.018450 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="5860d576-4d4f-49aa-b5ab-778f0ae0e6ec" containerName="collect-profiles" Sep 30 19:20:28 crc 
kubenswrapper[4747]: I0930 19:20:28.018906 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="5860d576-4d4f-49aa-b5ab-778f0ae0e6ec" containerName="collect-profiles" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.021900 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.031497 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjmw8"] Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.221229 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-utilities\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.221393 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-catalog-content\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.221593 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r48cg\" (UniqueName: \"kubernetes.io/projected/502daa11-85f8-4b46-ad65-e8b08879782e-kube-api-access-r48cg\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.323445 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-utilities\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.323733 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-catalog-content\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.323795 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r48cg\" (UniqueName: \"kubernetes.io/projected/502daa11-85f8-4b46-ad65-e8b08879782e-kube-api-access-r48cg\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.324023 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-utilities\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.324304 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-catalog-content\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.347301 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r48cg\" (UniqueName: 
\"kubernetes.io/projected/502daa11-85f8-4b46-ad65-e8b08879782e-kube-api-access-r48cg\") pod \"certified-operators-sjmw8\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.359844 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:28 crc kubenswrapper[4747]: I0930 19:20:28.893750 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjmw8"] Sep 30 19:20:29 crc kubenswrapper[4747]: I0930 19:20:29.079128 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjmw8" event={"ID":"502daa11-85f8-4b46-ad65-e8b08879782e","Type":"ContainerStarted","Data":"acf3f328a717cdf75215b2d2558db996128c1d05b618675047ca9e4a25ba9dc7"} Sep 30 19:20:30 crc kubenswrapper[4747]: I0930 19:20:30.095903 4747 generic.go:334] "Generic (PLEG): container finished" podID="502daa11-85f8-4b46-ad65-e8b08879782e" containerID="4caf9b9db4200322b8b2c14110212df0a8bb93093717a85a20198c0dd448f351" exitCode=0 Sep 30 19:20:30 crc kubenswrapper[4747]: I0930 19:20:30.096013 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjmw8" event={"ID":"502daa11-85f8-4b46-ad65-e8b08879782e","Type":"ContainerDied","Data":"4caf9b9db4200322b8b2c14110212df0a8bb93093717a85a20198c0dd448f351"} Sep 30 19:20:30 crc kubenswrapper[4747]: I0930 19:20:30.099437 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:20:32 crc kubenswrapper[4747]: I0930 19:20:32.123142 4747 generic.go:334] "Generic (PLEG): container finished" podID="502daa11-85f8-4b46-ad65-e8b08879782e" containerID="764b22d56802092cc1a1922422c7bb88533cf596ae5ab704ea131426bdc62e52" exitCode=0 Sep 30 19:20:32 crc kubenswrapper[4747]: I0930 19:20:32.123227 4747 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjmw8" event={"ID":"502daa11-85f8-4b46-ad65-e8b08879782e","Type":"ContainerDied","Data":"764b22d56802092cc1a1922422c7bb88533cf596ae5ab704ea131426bdc62e52"} Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.038222 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zk572"] Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.047182 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.054688 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk572"] Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.127332 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-utilities\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.127378 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-catalog-content\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.127417 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whzhs\" (UniqueName: \"kubernetes.io/projected/4e23e99c-e9c4-409d-808f-b93ab740c21a-kube-api-access-whzhs\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " 
pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.135435 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjmw8" event={"ID":"502daa11-85f8-4b46-ad65-e8b08879782e","Type":"ContainerStarted","Data":"e1d4d0d27c6bc3a5379c1eece99ef81cf5de9163753e51885225dccf820e31e6"} Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.161818 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sjmw8" podStartSLOduration=3.692330698 podStartE2EDuration="6.161799601s" podCreationTimestamp="2025-09-30 19:20:27 +0000 UTC" firstStartedPulling="2025-09-30 19:20:30.098694105 +0000 UTC m=+2069.758174259" lastFinishedPulling="2025-09-30 19:20:32.568163048 +0000 UTC m=+2072.227643162" observedRunningTime="2025-09-30 19:20:33.156959051 +0000 UTC m=+2072.816439195" watchObservedRunningTime="2025-09-30 19:20:33.161799601 +0000 UTC m=+2072.821279715" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.229061 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-utilities\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.229145 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-catalog-content\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.229212 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whzhs\" (UniqueName: 
\"kubernetes.io/projected/4e23e99c-e9c4-409d-808f-b93ab740c21a-kube-api-access-whzhs\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.229459 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-utilities\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.229734 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-catalog-content\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.258106 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whzhs\" (UniqueName: \"kubernetes.io/projected/4e23e99c-e9c4-409d-808f-b93ab740c21a-kube-api-access-whzhs\") pod \"redhat-marketplace-zk572\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.381023 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.411752 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mjzdn"] Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.414098 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.420396 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mjzdn"] Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.542322 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttqwj\" (UniqueName: \"kubernetes.io/projected/66942c76-46ae-45f0-aa22-a1181589f7b1-kube-api-access-ttqwj\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.542709 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-utilities\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.542799 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-catalog-content\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.644549 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttqwj\" (UniqueName: \"kubernetes.io/projected/66942c76-46ae-45f0-aa22-a1181589f7b1-kube-api-access-ttqwj\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.644590 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-utilities\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.644681 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-catalog-content\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.645213 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-catalog-content\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.645306 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-utilities\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.662028 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttqwj\" (UniqueName: \"kubernetes.io/projected/66942c76-46ae-45f0-aa22-a1181589f7b1-kube-api-access-ttqwj\") pod \"redhat-operators-mjzdn\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.796331 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:33 crc kubenswrapper[4747]: I0930 19:20:33.904874 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk572"] Sep 30 19:20:34 crc kubenswrapper[4747]: I0930 19:20:34.061826 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mjzdn"] Sep 30 19:20:34 crc kubenswrapper[4747]: W0930 19:20:34.073855 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66942c76_46ae_45f0_aa22_a1181589f7b1.slice/crio-b3dc0f55f7a4327c6cff73a27122442c98237eb31026d420960765fa350b9624 WatchSource:0}: Error finding container b3dc0f55f7a4327c6cff73a27122442c98237eb31026d420960765fa350b9624: Status 404 returned error can't find the container with id b3dc0f55f7a4327c6cff73a27122442c98237eb31026d420960765fa350b9624 Sep 30 19:20:34 crc kubenswrapper[4747]: I0930 19:20:34.142549 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjzdn" event={"ID":"66942c76-46ae-45f0-aa22-a1181589f7b1","Type":"ContainerStarted","Data":"b3dc0f55f7a4327c6cff73a27122442c98237eb31026d420960765fa350b9624"} Sep 30 19:20:34 crc kubenswrapper[4747]: I0930 19:20:34.144138 4747 generic.go:334] "Generic (PLEG): container finished" podID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerID="9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5" exitCode=0 Sep 30 19:20:34 crc kubenswrapper[4747]: I0930 19:20:34.144971 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk572" event={"ID":"4e23e99c-e9c4-409d-808f-b93ab740c21a","Type":"ContainerDied","Data":"9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5"} Sep 30 19:20:34 crc kubenswrapper[4747]: I0930 19:20:34.145084 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-zk572" event={"ID":"4e23e99c-e9c4-409d-808f-b93ab740c21a","Type":"ContainerStarted","Data":"dd85717f82a5d556a14270e716ed9bd6f628649ca142b415a98cf57d19c83a00"} Sep 30 19:20:34 crc kubenswrapper[4747]: E0930 19:20:34.281185 4747 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e23e99c_e9c4_409d_808f_b93ab740c21a.slice/crio-9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e23e99c_e9c4_409d_808f_b93ab740c21a.slice/crio-conmon-9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5.scope\": RecentStats: unable to find data in memory cache]" Sep 30 19:20:35 crc kubenswrapper[4747]: I0930 19:20:35.156208 4747 generic.go:334] "Generic (PLEG): container finished" podID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerID="870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4" exitCode=0 Sep 30 19:20:35 crc kubenswrapper[4747]: I0930 19:20:35.156553 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjzdn" event={"ID":"66942c76-46ae-45f0-aa22-a1181589f7b1","Type":"ContainerDied","Data":"870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4"} Sep 30 19:20:36 crc kubenswrapper[4747]: I0930 19:20:36.169810 4747 generic.go:334] "Generic (PLEG): container finished" podID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerID="73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8" exitCode=0 Sep 30 19:20:36 crc kubenswrapper[4747]: I0930 19:20:36.170264 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk572" 
event={"ID":"4e23e99c-e9c4-409d-808f-b93ab740c21a","Type":"ContainerDied","Data":"73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8"} Sep 30 19:20:37 crc kubenswrapper[4747]: I0930 19:20:37.187293 4747 generic.go:334] "Generic (PLEG): container finished" podID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerID="6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14" exitCode=0 Sep 30 19:20:37 crc kubenswrapper[4747]: I0930 19:20:37.187412 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjzdn" event={"ID":"66942c76-46ae-45f0-aa22-a1181589f7b1","Type":"ContainerDied","Data":"6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14"} Sep 30 19:20:38 crc kubenswrapper[4747]: I0930 19:20:38.197200 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk572" event={"ID":"4e23e99c-e9c4-409d-808f-b93ab740c21a","Type":"ContainerStarted","Data":"19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376"} Sep 30 19:20:38 crc kubenswrapper[4747]: I0930 19:20:38.227119 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zk572" podStartSLOduration=3.100454161 podStartE2EDuration="6.227094249s" podCreationTimestamp="2025-09-30 19:20:32 +0000 UTC" firstStartedPulling="2025-09-30 19:20:34.145948803 +0000 UTC m=+2073.805428917" lastFinishedPulling="2025-09-30 19:20:37.272588851 +0000 UTC m=+2076.932069005" observedRunningTime="2025-09-30 19:20:38.215376591 +0000 UTC m=+2077.874856825" watchObservedRunningTime="2025-09-30 19:20:38.227094249 +0000 UTC m=+2077.886574403" Sep 30 19:20:38 crc kubenswrapper[4747]: I0930 19:20:38.360442 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:38 crc kubenswrapper[4747]: I0930 19:20:38.360501 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:38 crc kubenswrapper[4747]: I0930 19:20:38.426834 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:39 crc kubenswrapper[4747]: I0930 19:20:39.208482 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjzdn" event={"ID":"66942c76-46ae-45f0-aa22-a1181589f7b1","Type":"ContainerStarted","Data":"f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9"} Sep 30 19:20:39 crc kubenswrapper[4747]: I0930 19:20:39.238278 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mjzdn" podStartSLOduration=2.7432657110000003 podStartE2EDuration="6.23825893s" podCreationTimestamp="2025-09-30 19:20:33 +0000 UTC" firstStartedPulling="2025-09-30 19:20:35.174401352 +0000 UTC m=+2074.833881476" lastFinishedPulling="2025-09-30 19:20:38.669394561 +0000 UTC m=+2078.328874695" observedRunningTime="2025-09-30 19:20:39.232514895 +0000 UTC m=+2078.891995009" watchObservedRunningTime="2025-09-30 19:20:39.23825893 +0000 UTC m=+2078.897739044" Sep 30 19:20:39 crc kubenswrapper[4747]: I0930 19:20:39.282507 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:42 crc kubenswrapper[4747]: I0930 19:20:42.190292 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjmw8"] Sep 30 19:20:42 crc kubenswrapper[4747]: I0930 19:20:42.191224 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sjmw8" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" containerName="registry-server" containerID="cri-o://e1d4d0d27c6bc3a5379c1eece99ef81cf5de9163753e51885225dccf820e31e6" gracePeriod=2 Sep 30 19:20:43 crc kubenswrapper[4747]: I0930 
19:20:43.381505 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:43 crc kubenswrapper[4747]: I0930 19:20:43.382550 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:43 crc kubenswrapper[4747]: I0930 19:20:43.468882 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:43 crc kubenswrapper[4747]: I0930 19:20:43.796809 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:43 crc kubenswrapper[4747]: I0930 19:20:43.797463 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.282863 4747 generic.go:334] "Generic (PLEG): container finished" podID="502daa11-85f8-4b46-ad65-e8b08879782e" containerID="e1d4d0d27c6bc3a5379c1eece99ef81cf5de9163753e51885225dccf820e31e6" exitCode=0 Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.284055 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjmw8" event={"ID":"502daa11-85f8-4b46-ad65-e8b08879782e","Type":"ContainerDied","Data":"e1d4d0d27c6bc3a5379c1eece99ef81cf5de9163753e51885225dccf820e31e6"} Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.284086 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjmw8" event={"ID":"502daa11-85f8-4b46-ad65-e8b08879782e","Type":"ContainerDied","Data":"acf3f328a717cdf75215b2d2558db996128c1d05b618675047ca9e4a25ba9dc7"} Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.284098 4747 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="acf3f328a717cdf75215b2d2558db996128c1d05b618675047ca9e4a25ba9dc7" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.318228 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.350237 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.479325 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-utilities\") pod \"502daa11-85f8-4b46-ad65-e8b08879782e\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.479390 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-catalog-content\") pod \"502daa11-85f8-4b46-ad65-e8b08879782e\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.480020 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-utilities" (OuterVolumeSpecName: "utilities") pod "502daa11-85f8-4b46-ad65-e8b08879782e" (UID: "502daa11-85f8-4b46-ad65-e8b08879782e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.487333 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r48cg\" (UniqueName: \"kubernetes.io/projected/502daa11-85f8-4b46-ad65-e8b08879782e-kube-api-access-r48cg\") pod \"502daa11-85f8-4b46-ad65-e8b08879782e\" (UID: \"502daa11-85f8-4b46-ad65-e8b08879782e\") " Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.488512 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.494911 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502daa11-85f8-4b46-ad65-e8b08879782e-kube-api-access-r48cg" (OuterVolumeSpecName: "kube-api-access-r48cg") pod "502daa11-85f8-4b46-ad65-e8b08879782e" (UID: "502daa11-85f8-4b46-ad65-e8b08879782e"). InnerVolumeSpecName "kube-api-access-r48cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.525010 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "502daa11-85f8-4b46-ad65-e8b08879782e" (UID: "502daa11-85f8-4b46-ad65-e8b08879782e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.591030 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r48cg\" (UniqueName: \"kubernetes.io/projected/502daa11-85f8-4b46-ad65-e8b08879782e-kube-api-access-r48cg\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.591089 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/502daa11-85f8-4b46-ad65-e8b08879782e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:44 crc kubenswrapper[4747]: I0930 19:20:44.876149 4747 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mjzdn" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="registry-server" probeResult="failure" output=< Sep 30 19:20:44 crc kubenswrapper[4747]: timeout: failed to connect service ":50051" within 1s Sep 30 19:20:44 crc kubenswrapper[4747]: > Sep 30 19:20:45 crc kubenswrapper[4747]: I0930 19:20:45.292454 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjmw8" Sep 30 19:20:45 crc kubenswrapper[4747]: I0930 19:20:45.321726 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjmw8"] Sep 30 19:20:45 crc kubenswrapper[4747]: I0930 19:20:45.335235 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sjmw8"] Sep 30 19:20:45 crc kubenswrapper[4747]: I0930 19:20:45.595674 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk572"] Sep 30 19:20:46 crc kubenswrapper[4747]: I0930 19:20:46.303481 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zk572" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerName="registry-server" containerID="cri-o://19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376" gracePeriod=2 Sep 30 19:20:46 crc kubenswrapper[4747]: I0930 19:20:46.836602 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.037282 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whzhs\" (UniqueName: \"kubernetes.io/projected/4e23e99c-e9c4-409d-808f-b93ab740c21a-kube-api-access-whzhs\") pod \"4e23e99c-e9c4-409d-808f-b93ab740c21a\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.037444 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-utilities\") pod \"4e23e99c-e9c4-409d-808f-b93ab740c21a\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.037567 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-catalog-content\") pod \"4e23e99c-e9c4-409d-808f-b93ab740c21a\" (UID: \"4e23e99c-e9c4-409d-808f-b93ab740c21a\") " Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.038526 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-utilities" (OuterVolumeSpecName: "utilities") pod "4e23e99c-e9c4-409d-808f-b93ab740c21a" (UID: "4e23e99c-e9c4-409d-808f-b93ab740c21a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.047551 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e23e99c-e9c4-409d-808f-b93ab740c21a-kube-api-access-whzhs" (OuterVolumeSpecName: "kube-api-access-whzhs") pod "4e23e99c-e9c4-409d-808f-b93ab740c21a" (UID: "4e23e99c-e9c4-409d-808f-b93ab740c21a"). InnerVolumeSpecName "kube-api-access-whzhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.054321 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e23e99c-e9c4-409d-808f-b93ab740c21a" (UID: "4e23e99c-e9c4-409d-808f-b93ab740c21a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.097210 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" path="/var/lib/kubelet/pods/502daa11-85f8-4b46-ad65-e8b08879782e/volumes" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.140638 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.141590 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e23e99c-e9c4-409d-808f-b93ab740c21a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.141688 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whzhs\" (UniqueName: \"kubernetes.io/projected/4e23e99c-e9c4-409d-808f-b93ab740c21a-kube-api-access-whzhs\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.317625 4747 generic.go:334] "Generic (PLEG): container finished" podID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerID="19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376" exitCode=0 Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.317725 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zk572" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.317758 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk572" event={"ID":"4e23e99c-e9c4-409d-808f-b93ab740c21a","Type":"ContainerDied","Data":"19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376"} Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.318393 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zk572" event={"ID":"4e23e99c-e9c4-409d-808f-b93ab740c21a","Type":"ContainerDied","Data":"dd85717f82a5d556a14270e716ed9bd6f628649ca142b415a98cf57d19c83a00"} Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.318437 4747 scope.go:117] "RemoveContainer" containerID="19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.350604 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk572"] Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.362304 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zk572"] Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.369951 4747 scope.go:117] "RemoveContainer" containerID="73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.395959 4747 scope.go:117] "RemoveContainer" containerID="9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.451479 4747 scope.go:117] "RemoveContainer" containerID="19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376" Sep 30 19:20:47 crc kubenswrapper[4747]: E0930 19:20:47.452248 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376\": container with ID starting with 19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376 not found: ID does not exist" containerID="19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.452289 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376"} err="failed to get container status \"19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376\": rpc error: code = NotFound desc = could not find container \"19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376\": container with ID starting with 19862a64c055e113f0e724863c8505dc7ea46e57902c8409055f1f982f7a7376 not found: ID does not exist" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.452319 4747 scope.go:117] "RemoveContainer" containerID="73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8" Sep 30 19:20:47 crc kubenswrapper[4747]: E0930 19:20:47.452798 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8\": container with ID starting with 73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8 not found: ID does not exist" containerID="73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.452828 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8"} err="failed to get container status \"73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8\": rpc error: code = NotFound desc = could not find container \"73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8\": container with ID 
starting with 73d547f4c1a96e21599efc33cd1a58e527931c2a33fb150f487c2a8fdf1a8bd8 not found: ID does not exist" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.452847 4747 scope.go:117] "RemoveContainer" containerID="9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5" Sep 30 19:20:47 crc kubenswrapper[4747]: E0930 19:20:47.453805 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5\": container with ID starting with 9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5 not found: ID does not exist" containerID="9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5" Sep 30 19:20:47 crc kubenswrapper[4747]: I0930 19:20:47.453830 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5"} err="failed to get container status \"9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5\": rpc error: code = NotFound desc = could not find container \"9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5\": container with ID starting with 9d30807fd833877e9aae3f1f602334c8332166b9108dcd4b8980a8ce2d4d1eb5 not found: ID does not exist" Sep 30 19:20:49 crc kubenswrapper[4747]: I0930 19:20:49.105798 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" path="/var/lib/kubelet/pods/4e23e99c-e9c4-409d-808f-b93ab740c21a/volumes" Sep 30 19:20:53 crc kubenswrapper[4747]: I0930 19:20:53.883709 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:53 crc kubenswrapper[4747]: I0930 19:20:53.954068 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:54 crc 
kubenswrapper[4747]: I0930 19:20:54.140726 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mjzdn"] Sep 30 19:20:55 crc kubenswrapper[4747]: I0930 19:20:55.412841 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mjzdn" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="registry-server" containerID="cri-o://f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9" gracePeriod=2 Sep 30 19:20:55 crc kubenswrapper[4747]: I0930 19:20:55.985306 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.149087 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttqwj\" (UniqueName: \"kubernetes.io/projected/66942c76-46ae-45f0-aa22-a1181589f7b1-kube-api-access-ttqwj\") pod \"66942c76-46ae-45f0-aa22-a1181589f7b1\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.149166 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-utilities\") pod \"66942c76-46ae-45f0-aa22-a1181589f7b1\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.149256 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-catalog-content\") pod \"66942c76-46ae-45f0-aa22-a1181589f7b1\" (UID: \"66942c76-46ae-45f0-aa22-a1181589f7b1\") " Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.150913 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-utilities" 
(OuterVolumeSpecName: "utilities") pod "66942c76-46ae-45f0-aa22-a1181589f7b1" (UID: "66942c76-46ae-45f0-aa22-a1181589f7b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.158061 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66942c76-46ae-45f0-aa22-a1181589f7b1-kube-api-access-ttqwj" (OuterVolumeSpecName: "kube-api-access-ttqwj") pod "66942c76-46ae-45f0-aa22-a1181589f7b1" (UID: "66942c76-46ae-45f0-aa22-a1181589f7b1"). InnerVolumeSpecName "kube-api-access-ttqwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.252670 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttqwj\" (UniqueName: \"kubernetes.io/projected/66942c76-46ae-45f0-aa22-a1181589f7b1-kube-api-access-ttqwj\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.252743 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.264561 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66942c76-46ae-45f0-aa22-a1181589f7b1" (UID: "66942c76-46ae-45f0-aa22-a1181589f7b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.356769 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66942c76-46ae-45f0-aa22-a1181589f7b1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.426661 4747 generic.go:334] "Generic (PLEG): container finished" podID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerID="f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9" exitCode=0 Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.426713 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjzdn" event={"ID":"66942c76-46ae-45f0-aa22-a1181589f7b1","Type":"ContainerDied","Data":"f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9"} Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.427278 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjzdn" event={"ID":"66942c76-46ae-45f0-aa22-a1181589f7b1","Type":"ContainerDied","Data":"b3dc0f55f7a4327c6cff73a27122442c98237eb31026d420960765fa350b9624"} Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.426773 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mjzdn" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.427317 4747 scope.go:117] "RemoveContainer" containerID="f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.460827 4747 scope.go:117] "RemoveContainer" containerID="6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.485353 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mjzdn"] Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.497219 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mjzdn"] Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.507981 4747 scope.go:117] "RemoveContainer" containerID="870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.530561 4747 scope.go:117] "RemoveContainer" containerID="f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9" Sep 30 19:20:56 crc kubenswrapper[4747]: E0930 19:20:56.531131 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9\": container with ID starting with f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9 not found: ID does not exist" containerID="f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.531186 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9"} err="failed to get container status \"f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9\": rpc error: code = NotFound desc = could not find container 
\"f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9\": container with ID starting with f8ab27ae5fcedcd081e43cdab8a0069c4bb3882a52cb8ae2e61e87c59b1af8a9 not found: ID does not exist" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.531220 4747 scope.go:117] "RemoveContainer" containerID="6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14" Sep 30 19:20:56 crc kubenswrapper[4747]: E0930 19:20:56.531644 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14\": container with ID starting with 6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14 not found: ID does not exist" containerID="6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.531684 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14"} err="failed to get container status \"6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14\": rpc error: code = NotFound desc = could not find container \"6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14\": container with ID starting with 6c3f456ec5a5b8109f649bc815f6474ad8c223abc3028df53296d55e248dfc14 not found: ID does not exist" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.531710 4747 scope.go:117] "RemoveContainer" containerID="870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4" Sep 30 19:20:56 crc kubenswrapper[4747]: E0930 19:20:56.532222 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4\": container with ID starting with 870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4 not found: ID does not exist" 
containerID="870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4" Sep 30 19:20:56 crc kubenswrapper[4747]: I0930 19:20:56.532261 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4"} err="failed to get container status \"870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4\": rpc error: code = NotFound desc = could not find container \"870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4\": container with ID starting with 870218bbb0dfdc70e15e9f30de040bbe6329157b51c0da4a139d32f38c55d2a4 not found: ID does not exist" Sep 30 19:20:57 crc kubenswrapper[4747]: I0930 19:20:57.116991 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" path="/var/lib/kubelet/pods/66942c76-46ae-45f0-aa22-a1181589f7b1/volumes" Sep 30 19:22:07 crc kubenswrapper[4747]: I0930 19:22:07.656262 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:22:07 crc kubenswrapper[4747]: I0930 19:22:07.657056 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:22:37 crc kubenswrapper[4747]: I0930 19:22:37.655962 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Sep 30 19:22:37 crc kubenswrapper[4747]: I0930 19:22:37.656593 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.655619 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.656589 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.656679 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.657802 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.657972 4747 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" gracePeriod=600 Sep 30 19:23:07 crc kubenswrapper[4747]: E0930 19:23:07.794356 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.839438 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" exitCode=0 Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.839486 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468"} Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.839521 4747 scope.go:117] "RemoveContainer" containerID="3c1a5be81c807961582235eda9397736cd488865a6554919d8f86431797ba70e" Sep 30 19:23:07 crc kubenswrapper[4747]: I0930 19:23:07.840111 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:23:07 crc kubenswrapper[4747]: E0930 19:23:07.840358 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:23:19 crc kubenswrapper[4747]: I0930 19:23:19.087083 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:23:19 crc kubenswrapper[4747]: E0930 19:23:19.088511 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:23:31 crc kubenswrapper[4747]: I0930 19:23:31.097803 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:23:31 crc kubenswrapper[4747]: E0930 19:23:31.099074 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:23:43 crc kubenswrapper[4747]: I0930 19:23:43.087338 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:23:43 crc kubenswrapper[4747]: E0930 19:23:43.088138 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:23:58 crc kubenswrapper[4747]: I0930 19:23:58.086884 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:23:58 crc kubenswrapper[4747]: E0930 19:23:58.087724 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:24:11 crc kubenswrapper[4747]: I0930 19:24:11.093412 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:24:11 crc kubenswrapper[4747]: E0930 19:24:11.095341 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:24:23 crc kubenswrapper[4747]: I0930 19:24:23.087637 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:24:23 crc kubenswrapper[4747]: E0930 19:24:23.090064 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:24:36 crc kubenswrapper[4747]: I0930 19:24:36.087823 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:24:36 crc kubenswrapper[4747]: E0930 19:24:36.088902 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:24:51 crc kubenswrapper[4747]: I0930 19:24:51.092262 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:24:51 crc kubenswrapper[4747]: E0930 19:24:51.093163 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:25:05 crc kubenswrapper[4747]: I0930 19:25:05.088580 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:25:05 crc kubenswrapper[4747]: E0930 19:25:05.089821 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:25:20 crc kubenswrapper[4747]: I0930 19:25:20.087319 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:25:20 crc kubenswrapper[4747]: E0930 19:25:20.088499 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:25:33 crc kubenswrapper[4747]: I0930 19:25:33.087738 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:25:33 crc kubenswrapper[4747]: E0930 19:25:33.088922 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:25:45 crc kubenswrapper[4747]: I0930 19:25:45.087752 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:25:45 crc kubenswrapper[4747]: E0930 19:25:45.088837 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:25:59 crc kubenswrapper[4747]: I0930 19:25:59.087048 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:25:59 crc kubenswrapper[4747]: E0930 19:25:59.087996 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:26:10 crc kubenswrapper[4747]: I0930 19:26:10.088423 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:26:10 crc kubenswrapper[4747]: E0930 19:26:10.089673 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:26:25 crc kubenswrapper[4747]: I0930 19:26:25.089029 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:26:25 crc kubenswrapper[4747]: E0930 19:26:25.090214 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:26:38 crc kubenswrapper[4747]: I0930 19:26:38.088141 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:26:38 crc kubenswrapper[4747]: E0930 19:26:38.091120 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:26:52 crc kubenswrapper[4747]: I0930 19:26:52.088206 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:26:52 crc kubenswrapper[4747]: E0930 19:26:52.089010 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:27:06 crc kubenswrapper[4747]: I0930 19:27:06.087240 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:27:06 crc kubenswrapper[4747]: E0930 19:27:06.088455 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:27:10 crc kubenswrapper[4747]: I0930 19:27:10.332481 4747 scope.go:117] "RemoveContainer" containerID="e1d4d0d27c6bc3a5379c1eece99ef81cf5de9163753e51885225dccf820e31e6" Sep 30 19:27:10 crc kubenswrapper[4747]: I0930 19:27:10.365529 4747 scope.go:117] "RemoveContainer" containerID="4caf9b9db4200322b8b2c14110212df0a8bb93093717a85a20198c0dd448f351" Sep 30 19:27:10 crc kubenswrapper[4747]: I0930 19:27:10.397453 4747 scope.go:117] "RemoveContainer" containerID="764b22d56802092cc1a1922422c7bb88533cf596ae5ab704ea131426bdc62e52" Sep 30 19:27:19 crc kubenswrapper[4747]: I0930 19:27:19.088628 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:27:19 crc kubenswrapper[4747]: E0930 19:27:19.090371 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:27:33 crc kubenswrapper[4747]: I0930 19:27:33.087975 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:27:33 crc kubenswrapper[4747]: E0930 19:27:33.089050 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:27:48 crc kubenswrapper[4747]: I0930 19:27:48.088560 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:27:48 crc kubenswrapper[4747]: E0930 19:27:48.090058 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:28:00 crc kubenswrapper[4747]: I0930 19:28:00.088247 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:28:00 crc kubenswrapper[4747]: E0930 19:28:00.089492 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:28:13 crc kubenswrapper[4747]: I0930 19:28:13.088240 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:28:14 crc kubenswrapper[4747]: I0930 19:28:14.216849 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" 
event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"649c3d2d5401711a3ee99e315af9ba5a91ac1461a0e9179d23cb705857ad9a4e"} Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.125706 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j2nqn"] Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126605 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerName="extract-utilities" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126620 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerName="extract-utilities" Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126635 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" containerName="extract-utilities" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126644 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" containerName="extract-utilities" Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126655 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="extract-utilities" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126663 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="extract-utilities" Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126690 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" containerName="extract-content" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126698 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" containerName="extract-content" Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126713 4747 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerName="extract-content" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126721 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerName="extract-content" Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126739 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126747 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126761 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="extract-content" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126770 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="extract-content" Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126785 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126794 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: E0930 19:29:19.126811 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.126820 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.127254 4747 
memory_manager.go:354] "RemoveStaleState removing state" podUID="502daa11-85f8-4b46-ad65-e8b08879782e" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.127281 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="66942c76-46ae-45f0-aa22-a1181589f7b1" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.127293 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e23e99c-e9c4-409d-808f-b93ab740c21a" containerName="registry-server" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.129268 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.141502 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j2nqn"] Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.290356 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkbbx\" (UniqueName: \"kubernetes.io/projected/904bc593-0cdb-4dac-9068-49abc507509d-kube-api-access-wkbbx\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.290417 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-catalog-content\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.290511 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-utilities\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.392474 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-utilities\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.392585 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkbbx\" (UniqueName: \"kubernetes.io/projected/904bc593-0cdb-4dac-9068-49abc507509d-kube-api-access-wkbbx\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.392621 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-catalog-content\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.393088 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-utilities\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.393139 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-catalog-content\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.411164 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkbbx\" (UniqueName: \"kubernetes.io/projected/904bc593-0cdb-4dac-9068-49abc507509d-kube-api-access-wkbbx\") pod \"community-operators-j2nqn\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.480029 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:19 crc kubenswrapper[4747]: I0930 19:29:19.962910 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j2nqn"] Sep 30 19:29:19 crc kubenswrapper[4747]: W0930 19:29:19.970621 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod904bc593_0cdb_4dac_9068_49abc507509d.slice/crio-5cbf47489ee8416aa63bdeeec66fe3fbfd5a94df60d286a6b0cf32d9b7dc9346 WatchSource:0}: Error finding container 5cbf47489ee8416aa63bdeeec66fe3fbfd5a94df60d286a6b0cf32d9b7dc9346: Status 404 returned error can't find the container with id 5cbf47489ee8416aa63bdeeec66fe3fbfd5a94df60d286a6b0cf32d9b7dc9346 Sep 30 19:29:20 crc kubenswrapper[4747]: I0930 19:29:20.961162 4747 generic.go:334] "Generic (PLEG): container finished" podID="904bc593-0cdb-4dac-9068-49abc507509d" containerID="516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38" exitCode=0 Sep 30 19:29:20 crc kubenswrapper[4747]: I0930 19:29:20.961280 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2nqn" 
event={"ID":"904bc593-0cdb-4dac-9068-49abc507509d","Type":"ContainerDied","Data":"516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38"} Sep 30 19:29:20 crc kubenswrapper[4747]: I0930 19:29:20.961868 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2nqn" event={"ID":"904bc593-0cdb-4dac-9068-49abc507509d","Type":"ContainerStarted","Data":"5cbf47489ee8416aa63bdeeec66fe3fbfd5a94df60d286a6b0cf32d9b7dc9346"} Sep 30 19:29:20 crc kubenswrapper[4747]: I0930 19:29:20.968747 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:29:21 crc kubenswrapper[4747]: I0930 19:29:21.970647 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2nqn" event={"ID":"904bc593-0cdb-4dac-9068-49abc507509d","Type":"ContainerStarted","Data":"ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac"} Sep 30 19:29:22 crc kubenswrapper[4747]: I0930 19:29:22.981980 4747 generic.go:334] "Generic (PLEG): container finished" podID="904bc593-0cdb-4dac-9068-49abc507509d" containerID="ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac" exitCode=0 Sep 30 19:29:22 crc kubenswrapper[4747]: I0930 19:29:22.982091 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2nqn" event={"ID":"904bc593-0cdb-4dac-9068-49abc507509d","Type":"ContainerDied","Data":"ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac"} Sep 30 19:29:24 crc kubenswrapper[4747]: I0930 19:29:24.003722 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2nqn" event={"ID":"904bc593-0cdb-4dac-9068-49abc507509d","Type":"ContainerStarted","Data":"31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c"} Sep 30 19:29:24 crc kubenswrapper[4747]: I0930 19:29:24.032694 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-j2nqn" podStartSLOduration=2.401792103 podStartE2EDuration="5.032672523s" podCreationTimestamp="2025-09-30 19:29:19 +0000 UTC" firstStartedPulling="2025-09-30 19:29:20.968328205 +0000 UTC m=+2600.627808359" lastFinishedPulling="2025-09-30 19:29:23.599208625 +0000 UTC m=+2603.258688779" observedRunningTime="2025-09-30 19:29:24.024850369 +0000 UTC m=+2603.684330503" watchObservedRunningTime="2025-09-30 19:29:24.032672523 +0000 UTC m=+2603.692152647" Sep 30 19:29:29 crc kubenswrapper[4747]: I0930 19:29:29.480512 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:29 crc kubenswrapper[4747]: I0930 19:29:29.481134 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:29 crc kubenswrapper[4747]: I0930 19:29:29.537968 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:30 crc kubenswrapper[4747]: I0930 19:29:30.150411 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:30 crc kubenswrapper[4747]: I0930 19:29:30.227162 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j2nqn"] Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.086916 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j2nqn" podUID="904bc593-0cdb-4dac-9068-49abc507509d" containerName="registry-server" containerID="cri-o://31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c" gracePeriod=2 Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.612123 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.704366 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-catalog-content\") pod \"904bc593-0cdb-4dac-9068-49abc507509d\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.704766 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-utilities\") pod \"904bc593-0cdb-4dac-9068-49abc507509d\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.704957 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkbbx\" (UniqueName: \"kubernetes.io/projected/904bc593-0cdb-4dac-9068-49abc507509d-kube-api-access-wkbbx\") pod \"904bc593-0cdb-4dac-9068-49abc507509d\" (UID: \"904bc593-0cdb-4dac-9068-49abc507509d\") " Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.705964 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-utilities" (OuterVolumeSpecName: "utilities") pod "904bc593-0cdb-4dac-9068-49abc507509d" (UID: "904bc593-0cdb-4dac-9068-49abc507509d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.720177 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/904bc593-0cdb-4dac-9068-49abc507509d-kube-api-access-wkbbx" (OuterVolumeSpecName: "kube-api-access-wkbbx") pod "904bc593-0cdb-4dac-9068-49abc507509d" (UID: "904bc593-0cdb-4dac-9068-49abc507509d"). InnerVolumeSpecName "kube-api-access-wkbbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.789509 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "904bc593-0cdb-4dac-9068-49abc507509d" (UID: "904bc593-0cdb-4dac-9068-49abc507509d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.807062 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.807093 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkbbx\" (UniqueName: \"kubernetes.io/projected/904bc593-0cdb-4dac-9068-49abc507509d-kube-api-access-wkbbx\") on node \"crc\" DevicePath \"\"" Sep 30 19:29:32 crc kubenswrapper[4747]: I0930 19:29:32.807102 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904bc593-0cdb-4dac-9068-49abc507509d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.101140 4747 generic.go:334] "Generic (PLEG): container finished" podID="904bc593-0cdb-4dac-9068-49abc507509d" containerID="31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c" exitCode=0 Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.101252 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j2nqn" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.104337 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2nqn" event={"ID":"904bc593-0cdb-4dac-9068-49abc507509d","Type":"ContainerDied","Data":"31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c"} Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.104386 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2nqn" event={"ID":"904bc593-0cdb-4dac-9068-49abc507509d","Type":"ContainerDied","Data":"5cbf47489ee8416aa63bdeeec66fe3fbfd5a94df60d286a6b0cf32d9b7dc9346"} Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.104417 4747 scope.go:117] "RemoveContainer" containerID="31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.135624 4747 scope.go:117] "RemoveContainer" containerID="ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.163891 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j2nqn"] Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.178859 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j2nqn"] Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.194100 4747 scope.go:117] "RemoveContainer" containerID="516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.236450 4747 scope.go:117] "RemoveContainer" containerID="31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c" Sep 30 19:29:33 crc kubenswrapper[4747]: E0930 19:29:33.237026 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c\": container with ID starting with 31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c not found: ID does not exist" containerID="31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.237187 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c"} err="failed to get container status \"31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c\": rpc error: code = NotFound desc = could not find container \"31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c\": container with ID starting with 31c1ff732c90cd84a01868b86b56b10215d61653d787910f84c2605ef6a6e73c not found: ID does not exist" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.237242 4747 scope.go:117] "RemoveContainer" containerID="ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac" Sep 30 19:29:33 crc kubenswrapper[4747]: E0930 19:29:33.237718 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac\": container with ID starting with ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac not found: ID does not exist" containerID="ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.237786 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac"} err="failed to get container status \"ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac\": rpc error: code = NotFound desc = could not find container \"ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac\": container with ID 
starting with ab9c59b7463ea5cfe207d4da1a128f0ff0df5d2daa0f3d7f8b86889e366bfcac not found: ID does not exist" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.237837 4747 scope.go:117] "RemoveContainer" containerID="516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38" Sep 30 19:29:33 crc kubenswrapper[4747]: E0930 19:29:33.238275 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38\": container with ID starting with 516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38 not found: ID does not exist" containerID="516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38" Sep 30 19:29:33 crc kubenswrapper[4747]: I0930 19:29:33.239278 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38"} err="failed to get container status \"516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38\": rpc error: code = NotFound desc = could not find container \"516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38\": container with ID starting with 516b84269772d4a65c9f7fdcc94ca7a8df17621ec484035bc213428850f97b38 not found: ID does not exist" Sep 30 19:29:35 crc kubenswrapper[4747]: I0930 19:29:35.117686 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="904bc593-0cdb-4dac-9068-49abc507509d" path="/var/lib/kubelet/pods/904bc593-0cdb-4dac-9068-49abc507509d/volumes" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.171309 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k"] Sep 30 19:30:00 crc kubenswrapper[4747]: E0930 19:30:00.174772 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="904bc593-0cdb-4dac-9068-49abc507509d" containerName="extract-utilities" Sep 
30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.174817 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="904bc593-0cdb-4dac-9068-49abc507509d" containerName="extract-utilities" Sep 30 19:30:00 crc kubenswrapper[4747]: E0930 19:30:00.174862 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="904bc593-0cdb-4dac-9068-49abc507509d" containerName="extract-content" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.174873 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="904bc593-0cdb-4dac-9068-49abc507509d" containerName="extract-content" Sep 30 19:30:00 crc kubenswrapper[4747]: E0930 19:30:00.174906 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="904bc593-0cdb-4dac-9068-49abc507509d" containerName="registry-server" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.174916 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="904bc593-0cdb-4dac-9068-49abc507509d" containerName="registry-server" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.177252 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="904bc593-0cdb-4dac-9068-49abc507509d" containerName="registry-server" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.179794 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.185227 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.186262 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.202755 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k"] Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.311226 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvmkd\" (UniqueName: \"kubernetes.io/projected/28df34ff-2a7c-46a5-b776-f0d321bae813-kube-api-access-lvmkd\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.311646 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28df34ff-2a7c-46a5-b776-f0d321bae813-secret-volume\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.311707 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28df34ff-2a7c-46a5-b776-f0d321bae813-config-volume\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.413364 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28df34ff-2a7c-46a5-b776-f0d321bae813-secret-volume\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.413527 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28df34ff-2a7c-46a5-b776-f0d321bae813-config-volume\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.413663 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvmkd\" (UniqueName: \"kubernetes.io/projected/28df34ff-2a7c-46a5-b776-f0d321bae813-kube-api-access-lvmkd\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.415264 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28df34ff-2a7c-46a5-b776-f0d321bae813-config-volume\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.419702 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/28df34ff-2a7c-46a5-b776-f0d321bae813-secret-volume\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.436124 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvmkd\" (UniqueName: \"kubernetes.io/projected/28df34ff-2a7c-46a5-b776-f0d321bae813-kube-api-access-lvmkd\") pod \"collect-profiles-29321010-vpt9k\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.542688 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:00 crc kubenswrapper[4747]: I0930 19:30:00.859591 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k"] Sep 30 19:30:01 crc kubenswrapper[4747]: I0930 19:30:01.410561 4747 generic.go:334] "Generic (PLEG): container finished" podID="28df34ff-2a7c-46a5-b776-f0d321bae813" containerID="1e89b467d6b19a69b8c04030d4d942a52ac2f457f7de81ec09e8c841138fc749" exitCode=0 Sep 30 19:30:01 crc kubenswrapper[4747]: I0930 19:30:01.410679 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" event={"ID":"28df34ff-2a7c-46a5-b776-f0d321bae813","Type":"ContainerDied","Data":"1e89b467d6b19a69b8c04030d4d942a52ac2f457f7de81ec09e8c841138fc749"} Sep 30 19:30:01 crc kubenswrapper[4747]: I0930 19:30:01.410977 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" 
event={"ID":"28df34ff-2a7c-46a5-b776-f0d321bae813","Type":"ContainerStarted","Data":"24a89d8c9b375176190f92a309fd8b8a44a9cc1d17c037893fb516a0c4e719fd"} Sep 30 19:30:02 crc kubenswrapper[4747]: I0930 19:30:02.842056 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:02 crc kubenswrapper[4747]: I0930 19:30:02.963036 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvmkd\" (UniqueName: \"kubernetes.io/projected/28df34ff-2a7c-46a5-b776-f0d321bae813-kube-api-access-lvmkd\") pod \"28df34ff-2a7c-46a5-b776-f0d321bae813\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " Sep 30 19:30:02 crc kubenswrapper[4747]: I0930 19:30:02.963139 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28df34ff-2a7c-46a5-b776-f0d321bae813-config-volume\") pod \"28df34ff-2a7c-46a5-b776-f0d321bae813\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " Sep 30 19:30:02 crc kubenswrapper[4747]: I0930 19:30:02.963279 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28df34ff-2a7c-46a5-b776-f0d321bae813-secret-volume\") pod \"28df34ff-2a7c-46a5-b776-f0d321bae813\" (UID: \"28df34ff-2a7c-46a5-b776-f0d321bae813\") " Sep 30 19:30:02 crc kubenswrapper[4747]: I0930 19:30:02.965653 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28df34ff-2a7c-46a5-b776-f0d321bae813-config-volume" (OuterVolumeSpecName: "config-volume") pod "28df34ff-2a7c-46a5-b776-f0d321bae813" (UID: "28df34ff-2a7c-46a5-b776-f0d321bae813"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:30:02 crc kubenswrapper[4747]: I0930 19:30:02.970523 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28df34ff-2a7c-46a5-b776-f0d321bae813-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "28df34ff-2a7c-46a5-b776-f0d321bae813" (UID: "28df34ff-2a7c-46a5-b776-f0d321bae813"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:30:02 crc kubenswrapper[4747]: I0930 19:30:02.971890 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28df34ff-2a7c-46a5-b776-f0d321bae813-kube-api-access-lvmkd" (OuterVolumeSpecName: "kube-api-access-lvmkd") pod "28df34ff-2a7c-46a5-b776-f0d321bae813" (UID: "28df34ff-2a7c-46a5-b776-f0d321bae813"). InnerVolumeSpecName "kube-api-access-lvmkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:30:03 crc kubenswrapper[4747]: I0930 19:30:03.066033 4747 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28df34ff-2a7c-46a5-b776-f0d321bae813-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:03 crc kubenswrapper[4747]: I0930 19:30:03.066734 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvmkd\" (UniqueName: \"kubernetes.io/projected/28df34ff-2a7c-46a5-b776-f0d321bae813-kube-api-access-lvmkd\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:03 crc kubenswrapper[4747]: I0930 19:30:03.066820 4747 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28df34ff-2a7c-46a5-b776-f0d321bae813-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:30:03 crc kubenswrapper[4747]: I0930 19:30:03.434373 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" 
event={"ID":"28df34ff-2a7c-46a5-b776-f0d321bae813","Type":"ContainerDied","Data":"24a89d8c9b375176190f92a309fd8b8a44a9cc1d17c037893fb516a0c4e719fd"} Sep 30 19:30:03 crc kubenswrapper[4747]: I0930 19:30:03.434434 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24a89d8c9b375176190f92a309fd8b8a44a9cc1d17c037893fb516a0c4e719fd" Sep 30 19:30:03 crc kubenswrapper[4747]: I0930 19:30:03.434514 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-vpt9k" Sep 30 19:30:03 crc kubenswrapper[4747]: I0930 19:30:03.947404 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg"] Sep 30 19:30:03 crc kubenswrapper[4747]: I0930 19:30:03.954891 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320965-5lgtg"] Sep 30 19:30:05 crc kubenswrapper[4747]: I0930 19:30:05.105800 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b5cf81a-9d0e-4f3f-8596-5c1e17f87431" path="/var/lib/kubelet/pods/6b5cf81a-9d0e-4f3f-8596-5c1e17f87431/volumes" Sep 30 19:30:10 crc kubenswrapper[4747]: I0930 19:30:10.563352 4747 scope.go:117] "RemoveContainer" containerID="99a40801fd332ee6c2834af04f43342f0d34b48265ee3bc75add57c395955901" Sep 30 19:30:37 crc kubenswrapper[4747]: I0930 19:30:37.655514 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:30:37 crc kubenswrapper[4747]: I0930 19:30:37.656348 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:31:07 crc kubenswrapper[4747]: I0930 19:31:07.655668 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:31:07 crc kubenswrapper[4747]: I0930 19:31:07.656542 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.586978 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cv79d"] Sep 30 19:31:14 crc kubenswrapper[4747]: E0930 19:31:14.588235 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28df34ff-2a7c-46a5-b776-f0d321bae813" containerName="collect-profiles" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.588255 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="28df34ff-2a7c-46a5-b776-f0d321bae813" containerName="collect-profiles" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.588638 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="28df34ff-2a7c-46a5-b776-f0d321bae813" containerName="collect-profiles" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.590818 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.599358 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cv79d"] Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.704721 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4m2\" (UniqueName: \"kubernetes.io/projected/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-kube-api-access-9k4m2\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.705008 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-utilities\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.705517 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-catalog-content\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.807853 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k4m2\" (UniqueName: \"kubernetes.io/projected/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-kube-api-access-9k4m2\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.808288 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-utilities\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.808370 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-catalog-content\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.808877 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-utilities\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.808984 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-catalog-content\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.835989 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k4m2\" (UniqueName: \"kubernetes.io/projected/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-kube-api-access-9k4m2\") pod \"redhat-operators-cv79d\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:14 crc kubenswrapper[4747]: I0930 19:31:14.919272 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:15 crc kubenswrapper[4747]: I0930 19:31:15.414088 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cv79d"] Sep 30 19:31:16 crc kubenswrapper[4747]: I0930 19:31:16.248783 4747 generic.go:334] "Generic (PLEG): container finished" podID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerID="752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26" exitCode=0 Sep 30 19:31:16 crc kubenswrapper[4747]: I0930 19:31:16.249645 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv79d" event={"ID":"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5","Type":"ContainerDied","Data":"752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26"} Sep 30 19:31:16 crc kubenswrapper[4747]: I0930 19:31:16.249681 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv79d" event={"ID":"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5","Type":"ContainerStarted","Data":"c7081e85841b9ac5bd7e51b9fa0e0eb0788d3bbefa84fa86e53e655e0e09676c"} Sep 30 19:31:18 crc kubenswrapper[4747]: I0930 19:31:18.275585 4747 generic.go:334] "Generic (PLEG): container finished" podID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerID="c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d" exitCode=0 Sep 30 19:31:18 crc kubenswrapper[4747]: I0930 19:31:18.275644 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv79d" event={"ID":"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5","Type":"ContainerDied","Data":"c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d"} Sep 30 19:31:19 crc kubenswrapper[4747]: I0930 19:31:19.288170 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv79d" 
event={"ID":"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5","Type":"ContainerStarted","Data":"663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff"} Sep 30 19:31:19 crc kubenswrapper[4747]: I0930 19:31:19.311537 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cv79d" podStartSLOduration=2.878079031 podStartE2EDuration="5.311519052s" podCreationTimestamp="2025-09-30 19:31:14 +0000 UTC" firstStartedPulling="2025-09-30 19:31:16.253406493 +0000 UTC m=+2715.912886637" lastFinishedPulling="2025-09-30 19:31:18.686846504 +0000 UTC m=+2718.346326658" observedRunningTime="2025-09-30 19:31:19.306308803 +0000 UTC m=+2718.965788917" watchObservedRunningTime="2025-09-30 19:31:19.311519052 +0000 UTC m=+2718.970999166" Sep 30 19:31:24 crc kubenswrapper[4747]: I0930 19:31:24.919674 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:24 crc kubenswrapper[4747]: I0930 19:31:24.920411 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:24 crc kubenswrapper[4747]: I0930 19:31:24.996778 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:25 crc kubenswrapper[4747]: I0930 19:31:25.426123 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:25 crc kubenswrapper[4747]: I0930 19:31:25.492808 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cv79d"] Sep 30 19:31:27 crc kubenswrapper[4747]: I0930 19:31:27.376163 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cv79d" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerName="registry-server" 
containerID="cri-o://663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff" gracePeriod=2 Sep 30 19:31:27 crc kubenswrapper[4747]: I0930 19:31:27.936556 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:27 crc kubenswrapper[4747]: I0930 19:31:27.993798 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k4m2\" (UniqueName: \"kubernetes.io/projected/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-kube-api-access-9k4m2\") pod \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " Sep 30 19:31:27 crc kubenswrapper[4747]: I0930 19:31:27.993973 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-catalog-content\") pod \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " Sep 30 19:31:27 crc kubenswrapper[4747]: I0930 19:31:27.994157 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-utilities\") pod \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\" (UID: \"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5\") " Sep 30 19:31:27 crc kubenswrapper[4747]: I0930 19:31:27.995036 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-utilities" (OuterVolumeSpecName: "utilities") pod "34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" (UID: "34e2a331-8b1c-4d21-ba9f-6d4abd918ef5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.002351 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-kube-api-access-9k4m2" (OuterVolumeSpecName: "kube-api-access-9k4m2") pod "34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" (UID: "34e2a331-8b1c-4d21-ba9f-6d4abd918ef5"). InnerVolumeSpecName "kube-api-access-9k4m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.096279 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.096318 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k4m2\" (UniqueName: \"kubernetes.io/projected/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-kube-api-access-9k4m2\") on node \"crc\" DevicePath \"\"" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.391241 4747 generic.go:334] "Generic (PLEG): container finished" podID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerID="663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff" exitCode=0 Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.391299 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cv79d" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.391443 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv79d" event={"ID":"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5","Type":"ContainerDied","Data":"663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff"} Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.391546 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv79d" event={"ID":"34e2a331-8b1c-4d21-ba9f-6d4abd918ef5","Type":"ContainerDied","Data":"c7081e85841b9ac5bd7e51b9fa0e0eb0788d3bbefa84fa86e53e655e0e09676c"} Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.391591 4747 scope.go:117] "RemoveContainer" containerID="663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.429317 4747 scope.go:117] "RemoveContainer" containerID="c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.459823 4747 scope.go:117] "RemoveContainer" containerID="752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.525898 4747 scope.go:117] "RemoveContainer" containerID="663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff" Sep 30 19:31:28 crc kubenswrapper[4747]: E0930 19:31:28.527832 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff\": container with ID starting with 663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff not found: ID does not exist" containerID="663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.527906 4747 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff"} err="failed to get container status \"663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff\": rpc error: code = NotFound desc = could not find container \"663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff\": container with ID starting with 663e1e127b15a237f6d172b433a013f6a8a309c97e13ee639fc64dd2244473ff not found: ID does not exist" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.527968 4747 scope.go:117] "RemoveContainer" containerID="c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d" Sep 30 19:31:28 crc kubenswrapper[4747]: E0930 19:31:28.529642 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d\": container with ID starting with c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d not found: ID does not exist" containerID="c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.529739 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d"} err="failed to get container status \"c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d\": rpc error: code = NotFound desc = could not find container \"c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d\": container with ID starting with c350f115faa2597dd39c889a0114b5c9dce9e2bac1ae1491b1413c0e060c988d not found: ID does not exist" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.529815 4747 scope.go:117] "RemoveContainer" containerID="752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26" Sep 30 19:31:28 crc kubenswrapper[4747]: E0930 19:31:28.530611 4747 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26\": container with ID starting with 752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26 not found: ID does not exist" containerID="752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26" Sep 30 19:31:28 crc kubenswrapper[4747]: I0930 19:31:28.530645 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26"} err="failed to get container status \"752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26\": rpc error: code = NotFound desc = could not find container \"752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26\": container with ID starting with 752ba27aba40e9819477523b72d5799934e56f9f13b87d507ddd3f094d870b26 not found: ID does not exist" Sep 30 19:31:29 crc kubenswrapper[4747]: I0930 19:31:29.292626 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" (UID: "34e2a331-8b1c-4d21-ba9f-6d4abd918ef5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:31:29 crc kubenswrapper[4747]: I0930 19:31:29.327597 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:31:29 crc kubenswrapper[4747]: I0930 19:31:29.647502 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cv79d"] Sep 30 19:31:29 crc kubenswrapper[4747]: I0930 19:31:29.658254 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cv79d"] Sep 30 19:31:31 crc kubenswrapper[4747]: I0930 19:31:31.104599 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" path="/var/lib/kubelet/pods/34e2a331-8b1c-4d21-ba9f-6d4abd918ef5/volumes" Sep 30 19:31:37 crc kubenswrapper[4747]: I0930 19:31:37.655412 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:31:37 crc kubenswrapper[4747]: I0930 19:31:37.656213 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:31:37 crc kubenswrapper[4747]: I0930 19:31:37.656287 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 19:31:37 crc kubenswrapper[4747]: I0930 19:31:37.657447 4747 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"649c3d2d5401711a3ee99e315af9ba5a91ac1461a0e9179d23cb705857ad9a4e"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:31:37 crc kubenswrapper[4747]: I0930 19:31:37.657560 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://649c3d2d5401711a3ee99e315af9ba5a91ac1461a0e9179d23cb705857ad9a4e" gracePeriod=600 Sep 30 19:31:38 crc kubenswrapper[4747]: I0930 19:31:38.504990 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="649c3d2d5401711a3ee99e315af9ba5a91ac1461a0e9179d23cb705857ad9a4e" exitCode=0 Sep 30 19:31:38 crc kubenswrapper[4747]: I0930 19:31:38.505021 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"649c3d2d5401711a3ee99e315af9ba5a91ac1461a0e9179d23cb705857ad9a4e"} Sep 30 19:31:38 crc kubenswrapper[4747]: I0930 19:31:38.505370 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc"} Sep 30 19:31:38 crc kubenswrapper[4747]: I0930 19:31:38.505397 4747 scope.go:117] "RemoveContainer" containerID="6dd8d3cf2bf96ef6fefa6b54f0ba7cb64eecbb702ec1606410b289f0e9f15468" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.442802 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-784nb"] Sep 30 19:31:45 crc 
kubenswrapper[4747]: E0930 19:31:45.443721 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerName="registry-server" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.443735 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerName="registry-server" Sep 30 19:31:45 crc kubenswrapper[4747]: E0930 19:31:45.443753 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerName="extract-utilities" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.443762 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerName="extract-utilities" Sep 30 19:31:45 crc kubenswrapper[4747]: E0930 19:31:45.443798 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerName="extract-content" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.443816 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerName="extract-content" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.444034 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e2a331-8b1c-4d21-ba9f-6d4abd918ef5" containerName="registry-server" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.445629 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.453073 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-784nb"] Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.566895 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpq5k\" (UniqueName: \"kubernetes.io/projected/7c53d6ca-8633-4147-b739-32c60645976a-kube-api-access-tpq5k\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.567026 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-utilities\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.567107 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-catalog-content\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.668643 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-catalog-content\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.668749 4747 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tpq5k\" (UniqueName: \"kubernetes.io/projected/7c53d6ca-8633-4147-b739-32c60645976a-kube-api-access-tpq5k\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.668920 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-utilities\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.669209 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-catalog-content\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.669556 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-utilities\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.692275 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpq5k\" (UniqueName: \"kubernetes.io/projected/7c53d6ca-8633-4147-b739-32c60645976a-kube-api-access-tpq5k\") pod \"certified-operators-784nb\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:45 crc kubenswrapper[4747]: I0930 19:31:45.774489 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:46 crc kubenswrapper[4747]: I0930 19:31:46.253142 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-784nb"] Sep 30 19:31:46 crc kubenswrapper[4747]: W0930 19:31:46.258118 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c53d6ca_8633_4147_b739_32c60645976a.slice/crio-a160486f623c72491cc6ba409ec876b5e587f4d64890e40c2d1d222242aca6c5 WatchSource:0}: Error finding container a160486f623c72491cc6ba409ec876b5e587f4d64890e40c2d1d222242aca6c5: Status 404 returned error can't find the container with id a160486f623c72491cc6ba409ec876b5e587f4d64890e40c2d1d222242aca6c5 Sep 30 19:31:46 crc kubenswrapper[4747]: I0930 19:31:46.590429 4747 generic.go:334] "Generic (PLEG): container finished" podID="7c53d6ca-8633-4147-b739-32c60645976a" containerID="6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036" exitCode=0 Sep 30 19:31:46 crc kubenswrapper[4747]: I0930 19:31:46.590500 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784nb" event={"ID":"7c53d6ca-8633-4147-b739-32c60645976a","Type":"ContainerDied","Data":"6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036"} Sep 30 19:31:46 crc kubenswrapper[4747]: I0930 19:31:46.590780 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784nb" event={"ID":"7c53d6ca-8633-4147-b739-32c60645976a","Type":"ContainerStarted","Data":"a160486f623c72491cc6ba409ec876b5e587f4d64890e40c2d1d222242aca6c5"} Sep 30 19:31:48 crc kubenswrapper[4747]: I0930 19:31:48.645284 4747 generic.go:334] "Generic (PLEG): container finished" podID="7c53d6ca-8633-4147-b739-32c60645976a" containerID="c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48" exitCode=0 Sep 30 19:31:48 crc kubenswrapper[4747]: I0930 
19:31:48.645391 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784nb" event={"ID":"7c53d6ca-8633-4147-b739-32c60645976a","Type":"ContainerDied","Data":"c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48"} Sep 30 19:31:49 crc kubenswrapper[4747]: I0930 19:31:49.662854 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784nb" event={"ID":"7c53d6ca-8633-4147-b739-32c60645976a","Type":"ContainerStarted","Data":"6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54"} Sep 30 19:31:49 crc kubenswrapper[4747]: I0930 19:31:49.700975 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-784nb" podStartSLOduration=2.237079275 podStartE2EDuration="4.700903907s" podCreationTimestamp="2025-09-30 19:31:45 +0000 UTC" firstStartedPulling="2025-09-30 19:31:46.59508666 +0000 UTC m=+2746.254566814" lastFinishedPulling="2025-09-30 19:31:49.058911282 +0000 UTC m=+2748.718391446" observedRunningTime="2025-09-30 19:31:49.69439791 +0000 UTC m=+2749.353878044" watchObservedRunningTime="2025-09-30 19:31:49.700903907 +0000 UTC m=+2749.360384051" Sep 30 19:31:55 crc kubenswrapper[4747]: I0930 19:31:55.775582 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:55 crc kubenswrapper[4747]: I0930 19:31:55.776243 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:55 crc kubenswrapper[4747]: I0930 19:31:55.834333 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:56 crc kubenswrapper[4747]: I0930 19:31:56.830042 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-784nb" Sep 30 
19:31:56 crc kubenswrapper[4747]: I0930 19:31:56.891417 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-784nb"] Sep 30 19:31:58 crc kubenswrapper[4747]: I0930 19:31:58.779073 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-784nb" podUID="7c53d6ca-8633-4147-b739-32c60645976a" containerName="registry-server" containerID="cri-o://6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54" gracePeriod=2 Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.348176 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.475538 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-catalog-content\") pod \"7c53d6ca-8633-4147-b739-32c60645976a\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.475702 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-utilities\") pod \"7c53d6ca-8633-4147-b739-32c60645976a\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.475739 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpq5k\" (UniqueName: \"kubernetes.io/projected/7c53d6ca-8633-4147-b739-32c60645976a-kube-api-access-tpq5k\") pod \"7c53d6ca-8633-4147-b739-32c60645976a\" (UID: \"7c53d6ca-8633-4147-b739-32c60645976a\") " Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.476628 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-utilities" (OuterVolumeSpecName: "utilities") pod "7c53d6ca-8633-4147-b739-32c60645976a" (UID: "7c53d6ca-8633-4147-b739-32c60645976a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.486670 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c53d6ca-8633-4147-b739-32c60645976a-kube-api-access-tpq5k" (OuterVolumeSpecName: "kube-api-access-tpq5k") pod "7c53d6ca-8633-4147-b739-32c60645976a" (UID: "7c53d6ca-8633-4147-b739-32c60645976a"). InnerVolumeSpecName "kube-api-access-tpq5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.519917 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c53d6ca-8633-4147-b739-32c60645976a" (UID: "7c53d6ca-8633-4147-b739-32c60645976a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.578218 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.578241 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpq5k\" (UniqueName: \"kubernetes.io/projected/7c53d6ca-8633-4147-b739-32c60645976a-kube-api-access-tpq5k\") on node \"crc\" DevicePath \"\"" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.578251 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c53d6ca-8633-4147-b739-32c60645976a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.788630 4747 generic.go:334] "Generic (PLEG): container finished" podID="7c53d6ca-8633-4147-b739-32c60645976a" containerID="6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54" exitCode=0 Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.788682 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784nb" event={"ID":"7c53d6ca-8633-4147-b739-32c60645976a","Type":"ContainerDied","Data":"6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54"} Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.788995 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784nb" event={"ID":"7c53d6ca-8633-4147-b739-32c60645976a","Type":"ContainerDied","Data":"a160486f623c72491cc6ba409ec876b5e587f4d64890e40c2d1d222242aca6c5"} Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.789017 4747 scope.go:117] "RemoveContainer" containerID="6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 
19:31:59.788732 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-784nb" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.822863 4747 scope.go:117] "RemoveContainer" containerID="c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.829883 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-784nb"] Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.836893 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-784nb"] Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.856536 4747 scope.go:117] "RemoveContainer" containerID="6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.893670 4747 scope.go:117] "RemoveContainer" containerID="6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54" Sep 30 19:31:59 crc kubenswrapper[4747]: E0930 19:31:59.894300 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54\": container with ID starting with 6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54 not found: ID does not exist" containerID="6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.894369 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54"} err="failed to get container status \"6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54\": rpc error: code = NotFound desc = could not find container \"6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54\": container with ID starting with 
6627addc5820ff65b058d29a4225ba09c9a718ec070f98e6ec9209a9e853aa54 not found: ID does not exist" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.894411 4747 scope.go:117] "RemoveContainer" containerID="c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48" Sep 30 19:31:59 crc kubenswrapper[4747]: E0930 19:31:59.895277 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48\": container with ID starting with c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48 not found: ID does not exist" containerID="c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.895317 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48"} err="failed to get container status \"c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48\": rpc error: code = NotFound desc = could not find container \"c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48\": container with ID starting with c59b6e3e8171b98fce1ff87cf9a064685accc7fa7d3ca3246983a62576dbfa48 not found: ID does not exist" Sep 30 19:31:59 crc kubenswrapper[4747]: I0930 19:31:59.895346 4747 scope.go:117] "RemoveContainer" containerID="6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036" Sep 30 19:31:59 crc kubenswrapper[4747]: E0930 19:31:59.895736 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036\": container with ID starting with 6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036 not found: ID does not exist" containerID="6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036" Sep 30 19:31:59 crc 
kubenswrapper[4747]: I0930 19:31:59.895768 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036"} err="failed to get container status \"6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036\": rpc error: code = NotFound desc = could not find container \"6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036\": container with ID starting with 6b2cd94c3a8f585c35b73d1efd21d033c52f3d068a0f9d6460cb0c7a59ae6036 not found: ID does not exist" Sep 30 19:32:01 crc kubenswrapper[4747]: I0930 19:32:01.102734 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c53d6ca-8633-4147-b739-32c60645976a" path="/var/lib/kubelet/pods/7c53d6ca-8633-4147-b739-32c60645976a/volumes" Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.808079 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zcr5t"] Sep 30 19:32:02 crc kubenswrapper[4747]: E0930 19:32:02.809158 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c53d6ca-8633-4147-b739-32c60645976a" containerName="registry-server" Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.809177 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c53d6ca-8633-4147-b739-32c60645976a" containerName="registry-server" Sep 30 19:32:02 crc kubenswrapper[4747]: E0930 19:32:02.809201 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c53d6ca-8633-4147-b739-32c60645976a" containerName="extract-utilities" Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.809211 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c53d6ca-8633-4147-b739-32c60645976a" containerName="extract-utilities" Sep 30 19:32:02 crc kubenswrapper[4747]: E0930 19:32:02.809251 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c53d6ca-8633-4147-b739-32c60645976a" containerName="extract-content" Sep 
30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.809262 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c53d6ca-8633-4147-b739-32c60645976a" containerName="extract-content" Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.809541 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c53d6ca-8633-4147-b739-32c60645976a" containerName="registry-server" Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.811458 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.828090 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcr5t"] Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.963827 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-catalog-content\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.964132 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lckvm\" (UniqueName: \"kubernetes.io/projected/05c1fa57-c433-47c5-a2a7-67ac94e69086-kube-api-access-lckvm\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:02 crc kubenswrapper[4747]: I0930 19:32:02.964270 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-utilities\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" 
Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.065731 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-catalog-content\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.065777 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lckvm\" (UniqueName: \"kubernetes.io/projected/05c1fa57-c433-47c5-a2a7-67ac94e69086-kube-api-access-lckvm\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.065809 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-utilities\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.066413 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-utilities\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.066646 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-catalog-content\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 
19:32:03.089742 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lckvm\" (UniqueName: \"kubernetes.io/projected/05c1fa57-c433-47c5-a2a7-67ac94e69086-kube-api-access-lckvm\") pod \"redhat-marketplace-zcr5t\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.142203 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.361080 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcr5t"] Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.856012 4747 generic.go:334] "Generic (PLEG): container finished" podID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerID="ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217" exitCode=0 Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.856147 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcr5t" event={"ID":"05c1fa57-c433-47c5-a2a7-67ac94e69086","Type":"ContainerDied","Data":"ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217"} Sep 30 19:32:03 crc kubenswrapper[4747]: I0930 19:32:03.856531 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcr5t" event={"ID":"05c1fa57-c433-47c5-a2a7-67ac94e69086","Type":"ContainerStarted","Data":"a6df608e5a310fe55f08dff3f82ca5727e15d9dcad7a0d3a5d757c59d83d9ac9"} Sep 30 19:32:04 crc kubenswrapper[4747]: I0930 19:32:04.868583 4747 generic.go:334] "Generic (PLEG): container finished" podID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerID="7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f" exitCode=0 Sep 30 19:32:04 crc kubenswrapper[4747]: I0930 19:32:04.868634 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-zcr5t" event={"ID":"05c1fa57-c433-47c5-a2a7-67ac94e69086","Type":"ContainerDied","Data":"7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f"} Sep 30 19:32:05 crc kubenswrapper[4747]: I0930 19:32:05.879487 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcr5t" event={"ID":"05c1fa57-c433-47c5-a2a7-67ac94e69086","Type":"ContainerStarted","Data":"f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb"} Sep 30 19:32:05 crc kubenswrapper[4747]: I0930 19:32:05.905286 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zcr5t" podStartSLOduration=2.480124667 podStartE2EDuration="3.905268949s" podCreationTimestamp="2025-09-30 19:32:02 +0000 UTC" firstStartedPulling="2025-09-30 19:32:03.860459972 +0000 UTC m=+2763.519940116" lastFinishedPulling="2025-09-30 19:32:05.285604274 +0000 UTC m=+2764.945084398" observedRunningTime="2025-09-30 19:32:05.898129514 +0000 UTC m=+2765.557609638" watchObservedRunningTime="2025-09-30 19:32:05.905268949 +0000 UTC m=+2765.564749073" Sep 30 19:32:13 crc kubenswrapper[4747]: I0930 19:32:13.143546 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:13 crc kubenswrapper[4747]: I0930 19:32:13.144594 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:13 crc kubenswrapper[4747]: I0930 19:32:13.201047 4747 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:14 crc kubenswrapper[4747]: I0930 19:32:14.060014 4747 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:14 crc kubenswrapper[4747]: I0930 19:32:14.130816 4747 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcr5t"] Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.001669 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zcr5t" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerName="registry-server" containerID="cri-o://f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb" gracePeriod=2 Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.558602 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.648997 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-utilities\") pod \"05c1fa57-c433-47c5-a2a7-67ac94e69086\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.649340 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-catalog-content\") pod \"05c1fa57-c433-47c5-a2a7-67ac94e69086\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.649419 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lckvm\" (UniqueName: \"kubernetes.io/projected/05c1fa57-c433-47c5-a2a7-67ac94e69086-kube-api-access-lckvm\") pod \"05c1fa57-c433-47c5-a2a7-67ac94e69086\" (UID: \"05c1fa57-c433-47c5-a2a7-67ac94e69086\") " Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.650415 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-utilities" (OuterVolumeSpecName: "utilities") pod 
"05c1fa57-c433-47c5-a2a7-67ac94e69086" (UID: "05c1fa57-c433-47c5-a2a7-67ac94e69086"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.659277 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05c1fa57-c433-47c5-a2a7-67ac94e69086-kube-api-access-lckvm" (OuterVolumeSpecName: "kube-api-access-lckvm") pod "05c1fa57-c433-47c5-a2a7-67ac94e69086" (UID: "05c1fa57-c433-47c5-a2a7-67ac94e69086"). InnerVolumeSpecName "kube-api-access-lckvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.667480 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05c1fa57-c433-47c5-a2a7-67ac94e69086" (UID: "05c1fa57-c433-47c5-a2a7-67ac94e69086"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.751517 4747 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.751578 4747 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c1fa57-c433-47c5-a2a7-67ac94e69086-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:16 crc kubenswrapper[4747]: I0930 19:32:16.751596 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lckvm\" (UniqueName: \"kubernetes.io/projected/05c1fa57-c433-47c5-a2a7-67ac94e69086-kube-api-access-lckvm\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.015611 4747 generic.go:334] "Generic (PLEG): container finished" podID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerID="f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb" exitCode=0 Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.015740 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcr5t" event={"ID":"05c1fa57-c433-47c5-a2a7-67ac94e69086","Type":"ContainerDied","Data":"f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb"} Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.015771 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcr5t" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.015815 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcr5t" event={"ID":"05c1fa57-c433-47c5-a2a7-67ac94e69086","Type":"ContainerDied","Data":"a6df608e5a310fe55f08dff3f82ca5727e15d9dcad7a0d3a5d757c59d83d9ac9"} Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.015836 4747 scope.go:117] "RemoveContainer" containerID="f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.052701 4747 scope.go:117] "RemoveContainer" containerID="7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.089621 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcr5t"] Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.097347 4747 scope.go:117] "RemoveContainer" containerID="ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.169013 4747 scope.go:117] "RemoveContainer" containerID="f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb" Sep 30 19:32:17 crc kubenswrapper[4747]: E0930 19:32:17.183918 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb\": container with ID starting with f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb not found: ID does not exist" containerID="f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.183981 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb"} err="failed to 
get container status \"f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb\": rpc error: code = NotFound desc = could not find container \"f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb\": container with ID starting with f1c902a6372d318f7fe53e19c120cd19e8b5ef2e18742876e9d51057eaa854eb not found: ID does not exist" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.184009 4747 scope.go:117] "RemoveContainer" containerID="7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f" Sep 30 19:32:17 crc kubenswrapper[4747]: E0930 19:32:17.184578 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f\": container with ID starting with 7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f not found: ID does not exist" containerID="7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.184598 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f"} err="failed to get container status \"7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f\": rpc error: code = NotFound desc = could not find container \"7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f\": container with ID starting with 7023b7709bcee8b3880dac2fbdf620ca929723b806b9e5b87ed363c26bda2c5f not found: ID does not exist" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.184611 4747 scope.go:117] "RemoveContainer" containerID="ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217" Sep 30 19:32:17 crc kubenswrapper[4747]: E0930 19:32:17.184872 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217\": container with ID starting with ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217 not found: ID does not exist" containerID="ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.184893 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217"} err="failed to get container status \"ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217\": rpc error: code = NotFound desc = could not find container \"ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217\": container with ID starting with ee11f76d6d94c3a8ea4ede07430cc63ca918362fa3092f2e63056b9ef92db217 not found: ID does not exist" Sep 30 19:32:17 crc kubenswrapper[4747]: I0930 19:32:17.185552 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcr5t"] Sep 30 19:32:19 crc kubenswrapper[4747]: I0930 19:32:19.106920 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" path="/var/lib/kubelet/pods/05c1fa57-c433-47c5-a2a7-67ac94e69086/volumes" Sep 30 19:34:07 crc kubenswrapper[4747]: I0930 19:34:07.655989 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:34:07 crc kubenswrapper[4747]: I0930 19:34:07.657274 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.517471 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kgq9j/must-gather-89th2"] Sep 30 19:34:20 crc kubenswrapper[4747]: E0930 19:34:20.518303 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerName="extract-utilities" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.518318 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerName="extract-utilities" Sep 30 19:34:20 crc kubenswrapper[4747]: E0930 19:34:20.518352 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerName="extract-content" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.518361 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerName="extract-content" Sep 30 19:34:20 crc kubenswrapper[4747]: E0930 19:34:20.518380 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerName="registry-server" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.518389 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerName="registry-server" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.518605 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="05c1fa57-c433-47c5-a2a7-67ac94e69086" containerName="registry-server" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.519754 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kgq9j/must-gather-89th2" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.521756 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kgq9j"/"openshift-service-ca.crt" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.525171 4747 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kgq9j"/"kube-root-ca.crt" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.525325 4747 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kgq9j"/"default-dockercfg-xgspz" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.528435 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kgq9j/must-gather-89th2"] Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.642116 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgmfh\" (UniqueName: \"kubernetes.io/projected/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-kube-api-access-fgmfh\") pod \"must-gather-89th2\" (UID: \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\") " pod="openshift-must-gather-kgq9j/must-gather-89th2" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.642356 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-must-gather-output\") pod \"must-gather-89th2\" (UID: \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\") " pod="openshift-must-gather-kgq9j/must-gather-89th2" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.743975 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-must-gather-output\") pod \"must-gather-89th2\" (UID: \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\") " 
pod="openshift-must-gather-kgq9j/must-gather-89th2" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.744103 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgmfh\" (UniqueName: \"kubernetes.io/projected/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-kube-api-access-fgmfh\") pod \"must-gather-89th2\" (UID: \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\") " pod="openshift-must-gather-kgq9j/must-gather-89th2" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.744678 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-must-gather-output\") pod \"must-gather-89th2\" (UID: \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\") " pod="openshift-must-gather-kgq9j/must-gather-89th2" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.763159 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgmfh\" (UniqueName: \"kubernetes.io/projected/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-kube-api-access-fgmfh\") pod \"must-gather-89th2\" (UID: \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\") " pod="openshift-must-gather-kgq9j/must-gather-89th2" Sep 30 19:34:20 crc kubenswrapper[4747]: I0930 19:34:20.848891 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kgq9j/must-gather-89th2" Sep 30 19:34:21 crc kubenswrapper[4747]: I0930 19:34:21.070028 4747 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kgq9j/must-gather-89th2"] Sep 30 19:34:21 crc kubenswrapper[4747]: I0930 19:34:21.078311 4747 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:34:21 crc kubenswrapper[4747]: I0930 19:34:21.388760 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/must-gather-89th2" event={"ID":"33174d61-4dee-47d5-b4b9-bf03d6e2deeb","Type":"ContainerStarted","Data":"5987673688ddabb4308ef2c56b821a28092c8faf74873b161066d4bef01c8ce1"} Sep 30 19:34:25 crc kubenswrapper[4747]: I0930 19:34:25.426991 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/must-gather-89th2" event={"ID":"33174d61-4dee-47d5-b4b9-bf03d6e2deeb","Type":"ContainerStarted","Data":"030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad"} Sep 30 19:34:26 crc kubenswrapper[4747]: I0930 19:34:26.435604 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/must-gather-89th2" event={"ID":"33174d61-4dee-47d5-b4b9-bf03d6e2deeb","Type":"ContainerStarted","Data":"65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36"} Sep 30 19:34:26 crc kubenswrapper[4747]: I0930 19:34:26.465304 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kgq9j/must-gather-89th2" podStartSLOduration=2.433194184 podStartE2EDuration="6.46527491s" podCreationTimestamp="2025-09-30 19:34:20 +0000 UTC" firstStartedPulling="2025-09-30 19:34:21.078132425 +0000 UTC m=+2900.737612539" lastFinishedPulling="2025-09-30 19:34:25.110213161 +0000 UTC m=+2904.769693265" observedRunningTime="2025-09-30 19:34:26.457487247 +0000 UTC m=+2906.116967371" watchObservedRunningTime="2025-09-30 19:34:26.46527491 +0000 UTC 
m=+2906.124755064" Sep 30 19:34:28 crc kubenswrapper[4747]: I0930 19:34:28.951093 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-bjxcr"] Sep 30 19:34:28 crc kubenswrapper[4747]: I0930 19:34:28.952446 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:34:29 crc kubenswrapper[4747]: I0930 19:34:29.114653 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b7473f3-c7c2-418f-93d9-51b3a98ee670-host\") pod \"crc-debug-bjxcr\" (UID: \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\") " pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:34:29 crc kubenswrapper[4747]: I0930 19:34:29.115039 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9v7h\" (UniqueName: \"kubernetes.io/projected/9b7473f3-c7c2-418f-93d9-51b3a98ee670-kube-api-access-r9v7h\") pod \"crc-debug-bjxcr\" (UID: \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\") " pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:34:29 crc kubenswrapper[4747]: I0930 19:34:29.216572 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b7473f3-c7c2-418f-93d9-51b3a98ee670-host\") pod \"crc-debug-bjxcr\" (UID: \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\") " pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:34:29 crc kubenswrapper[4747]: I0930 19:34:29.216835 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9v7h\" (UniqueName: \"kubernetes.io/projected/9b7473f3-c7c2-418f-93d9-51b3a98ee670-kube-api-access-r9v7h\") pod \"crc-debug-bjxcr\" (UID: \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\") " pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:34:29 crc kubenswrapper[4747]: I0930 19:34:29.216683 4747 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b7473f3-c7c2-418f-93d9-51b3a98ee670-host\") pod \"crc-debug-bjxcr\" (UID: \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\") " pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:34:29 crc kubenswrapper[4747]: I0930 19:34:29.238637 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9v7h\" (UniqueName: \"kubernetes.io/projected/9b7473f3-c7c2-418f-93d9-51b3a98ee670-kube-api-access-r9v7h\") pod \"crc-debug-bjxcr\" (UID: \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\") " pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:34:29 crc kubenswrapper[4747]: I0930 19:34:29.269977 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:34:29 crc kubenswrapper[4747]: W0930 19:34:29.315515 4747 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b7473f3_c7c2_418f_93d9_51b3a98ee670.slice/crio-1600334372a4262aba216e0df276578878309342818acbabdc55b80d18f8e68e WatchSource:0}: Error finding container 1600334372a4262aba216e0df276578878309342818acbabdc55b80d18f8e68e: Status 404 returned error can't find the container with id 1600334372a4262aba216e0df276578878309342818acbabdc55b80d18f8e68e Sep 30 19:34:29 crc kubenswrapper[4747]: I0930 19:34:29.464769 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" event={"ID":"9b7473f3-c7c2-418f-93d9-51b3a98ee670","Type":"ContainerStarted","Data":"1600334372a4262aba216e0df276578878309342818acbabdc55b80d18f8e68e"} Sep 30 19:34:37 crc kubenswrapper[4747]: I0930 19:34:37.655343 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:34:37 crc kubenswrapper[4747]: I0930 19:34:37.656072 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:34:39 crc kubenswrapper[4747]: I0930 19:34:39.542699 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" event={"ID":"9b7473f3-c7c2-418f-93d9-51b3a98ee670","Type":"ContainerStarted","Data":"f75466424bb669dacb2f42162fb4d81b22981e0734682deaac901a4c95189e8a"} Sep 30 19:34:39 crc kubenswrapper[4747]: I0930 19:34:39.557054 4747 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" podStartSLOduration=1.5308018570000002 podStartE2EDuration="11.557039753s" podCreationTimestamp="2025-09-30 19:34:28 +0000 UTC" firstStartedPulling="2025-09-30 19:34:29.318914313 +0000 UTC m=+2908.978394427" lastFinishedPulling="2025-09-30 19:34:39.345152219 +0000 UTC m=+2919.004632323" observedRunningTime="2025-09-30 19:34:39.556128107 +0000 UTC m=+2919.215608221" watchObservedRunningTime="2025-09-30 19:34:39.557039753 +0000 UTC m=+2919.216519857" Sep 30 19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.655692 4747 patch_prober.go:28] interesting pod/machine-config-daemon-pkmxs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.656330 4747 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" 
podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.656378 4747 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" Sep 30 19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.657081 4747 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc"} pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.657129 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerName="machine-config-daemon" containerID="cri-o://285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" gracePeriod=600 Sep 30 19:35:07 crc kubenswrapper[4747]: E0930 19:35:07.784248 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.868854 4747 generic.go:334] "Generic (PLEG): container finished" podID="a3fce119-955f-405b-bfb3-96aa4b34aef7" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" exitCode=0 Sep 30 
19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.868906 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerDied","Data":"285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc"} Sep 30 19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.869182 4747 scope.go:117] "RemoveContainer" containerID="649c3d2d5401711a3ee99e315af9ba5a91ac1461a0e9179d23cb705857ad9a4e" Sep 30 19:35:07 crc kubenswrapper[4747]: I0930 19:35:07.869763 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:35:07 crc kubenswrapper[4747]: E0930 19:35:07.870005 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:35:21 crc kubenswrapper[4747]: I0930 19:35:21.098202 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:35:21 crc kubenswrapper[4747]: E0930 19:35:21.098959 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:35:22 crc kubenswrapper[4747]: I0930 19:35:22.947729 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_13be932b-9552-4483-a16e-30c0564032b3/cinder-api/0.log" Sep 30 19:35:22 crc kubenswrapper[4747]: I0930 19:35:22.970834 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_13be932b-9552-4483-a16e-30c0564032b3/cinder-api-log/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.171198 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_24a9ac8d-d5be-42b1-95f3-9677a6f12434/probe/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.177694 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_24a9ac8d-d5be-42b1-95f3-9677a6f12434/cinder-scheduler/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.364426 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6bd94c88c7-zk8hh_7207174e-ce90-4450-8dac-9d434a26d7ae/init/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.512635 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6bd94c88c7-zk8hh_7207174e-ce90-4450-8dac-9d434a26d7ae/init/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.536546 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6bd94c88c7-zk8hh_7207174e-ce90-4450-8dac-9d434a26d7ae/dnsmasq-dns/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.676400 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a6393639-cf22-49a1-96e1-f20d11a72791/glance-httpd/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.723790 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a6393639-cf22-49a1-96e1-f20d11a72791/glance-log/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.877635 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_10f77684-656c-4043-b9b4-8e6bfdca1621/glance-httpd/0.log" Sep 30 19:35:23 crc kubenswrapper[4747]: I0930 19:35:23.904990 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_10f77684-656c-4043-b9b4-8e6bfdca1621/glance-log/0.log" Sep 30 19:35:24 crc kubenswrapper[4747]: I0930 19:35:24.092383 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-648477b6b5-24gx7_333eedad-3727-4187-b392-e4c4d71bc2d1/keystone-api/0.log" Sep 30 19:35:24 crc kubenswrapper[4747]: I0930 19:35:24.328197 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5677cbcc7-67jtn_89f566eb-95f6-490d-a9be-7995bc1dbd4f/neutron-api/0.log" Sep 30 19:35:24 crc kubenswrapper[4747]: I0930 19:35:24.484312 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5677cbcc7-67jtn_89f566eb-95f6-490d-a9be-7995bc1dbd4f/neutron-httpd/0.log" Sep 30 19:35:24 crc kubenswrapper[4747]: I0930 19:35:24.896155 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6dbab27f-1fd5-4576-a5e7-41ec04946217/nova-api-log/0.log" Sep 30 19:35:24 crc kubenswrapper[4747]: I0930 19:35:24.937507 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6dbab27f-1fd5-4576-a5e7-41ec04946217/nova-api-api/0.log" Sep 30 19:35:25 crc kubenswrapper[4747]: I0930 19:35:25.286455 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_26b23eeb-041a-473c-8748-d54b077c81f3/nova-cell0-conductor-conductor/0.log" Sep 30 19:35:25 crc kubenswrapper[4747]: I0930 19:35:25.649624 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7296927d-605d-4885-a57a-d489d59a2bb6/nova-cell1-conductor-conductor/0.log" Sep 30 19:35:25 crc kubenswrapper[4747]: I0930 19:35:25.956479 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ac0b1b32-053b-4384-81a6-d02aa6a15e65/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 19:35:26 crc kubenswrapper[4747]: I0930 19:35:26.221943 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_800e3b38-20c6-4605-98b5-67b9a1ef21f2/nova-metadata-log/0.log" Sep 30 19:35:26 crc kubenswrapper[4747]: I0930 19:35:26.549661 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fc984869-b7e5-4d68-9d09-8bf0f1815e2c/nova-scheduler-scheduler/0.log" Sep 30 19:35:26 crc kubenswrapper[4747]: I0930 19:35:26.644401 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2b7e2ab9-2510-47d9-b1c6-3562b1c968be/mysql-bootstrap/0.log" Sep 30 19:35:26 crc kubenswrapper[4747]: I0930 19:35:26.838563 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2b7e2ab9-2510-47d9-b1c6-3562b1c968be/mysql-bootstrap/0.log" Sep 30 19:35:26 crc kubenswrapper[4747]: I0930 19:35:26.904323 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2b7e2ab9-2510-47d9-b1c6-3562b1c968be/galera/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.107750 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_800e3b38-20c6-4605-98b5-67b9a1ef21f2/nova-metadata-metadata/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.139960 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6ffa91d2-f90c-4b61-be02-28351b9d7d30/mysql-bootstrap/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.294224 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6ffa91d2-f90c-4b61-be02-28351b9d7d30/mysql-bootstrap/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.325024 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_6ffa91d2-f90c-4b61-be02-28351b9d7d30/galera/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.498728 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a3ad8842-6027-4f43-b6bf-82096e3c90a3/openstackclient/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.621363 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ljrn6_03e4d2ba-8585-4342-8f1b-4ef78d65911b/openstack-network-exporter/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.825282 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9nf99_24e0750a-cba0-4fc3-8ff5-9ed716525dee/ovsdb-server-init/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.849699 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cbe40b05-41bb-4a7a-9d8d-24d9117d8bc6/memcached/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.974168 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9nf99_24e0750a-cba0-4fc3-8ff5-9ed716525dee/ovsdb-server/0.log" Sep 30 19:35:27 crc kubenswrapper[4747]: I0930 19:35:27.997095 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9nf99_24e0750a-cba0-4fc3-8ff5-9ed716525dee/ovsdb-server-init/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.014169 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9nf99_24e0750a-cba0-4fc3-8ff5-9ed716525dee/ovs-vswitchd/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.146368 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xgjfm_7211e36c-ce8d-434e-9952-e5f5eb7097ec/ovn-controller/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.172648 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58/openstack-network-exporter/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.250017 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac6bc3de-50fd-4d01-8fbc-60f07fdfdd58/ovn-northd/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.337356 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cbbbaf74-0d3f-45fb-8b60-e6edb738ebab/openstack-network-exporter/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.432737 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cbbbaf74-0d3f-45fb-8b60-e6edb738ebab/ovsdbserver-nb/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.483847 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6c1ea28e-2148-476c-9493-8e4be1d4dfa1/openstack-network-exporter/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.515057 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6c1ea28e-2148-476c-9493-8e4be1d4dfa1/ovsdbserver-sb/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.674389 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66b578897b-btxxw_ea495f4c-cb12-4b75-850e-ef1d64d1f9af/placement-api/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.705386 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66b578897b-btxxw_ea495f4c-cb12-4b75-850e-ef1d64d1f9af/placement-log/0.log" Sep 30 19:35:28 crc kubenswrapper[4747]: I0930 19:35:28.830525 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2ae06b6d-5f75-46a4-8805-0d99f3771c71/setup-container/0.log" Sep 30 19:35:29 crc kubenswrapper[4747]: I0930 19:35:29.027444 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2ae06b6d-5f75-46a4-8805-0d99f3771c71/setup-container/0.log" Sep 30 19:35:29 crc kubenswrapper[4747]: I0930 19:35:29.060390 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2ae06b6d-5f75-46a4-8805-0d99f3771c71/rabbitmq/0.log" Sep 30 19:35:29 crc kubenswrapper[4747]: I0930 19:35:29.092167 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_008f0030-d04c-427c-bc09-76874ee17b16/setup-container/0.log" Sep 30 19:35:29 crc kubenswrapper[4747]: I0930 19:35:29.285326 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_008f0030-d04c-427c-bc09-76874ee17b16/setup-container/0.log" Sep 30 19:35:29 crc kubenswrapper[4747]: I0930 19:35:29.333303 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_008f0030-d04c-427c-bc09-76874ee17b16/rabbitmq/0.log" Sep 30 19:35:34 crc kubenswrapper[4747]: I0930 19:35:34.086851 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:35:34 crc kubenswrapper[4747]: E0930 19:35:34.087597 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:35:46 crc kubenswrapper[4747]: I0930 19:35:46.088151 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:35:46 crc kubenswrapper[4747]: E0930 19:35:46.089303 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:35:59 crc kubenswrapper[4747]: I0930 19:35:59.087604 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:35:59 crc kubenswrapper[4747]: E0930 19:35:59.088595 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:35:59 crc kubenswrapper[4747]: I0930 19:35:59.365182 4747 generic.go:334] "Generic (PLEG): container finished" podID="9b7473f3-c7c2-418f-93d9-51b3a98ee670" containerID="f75466424bb669dacb2f42162fb4d81b22981e0734682deaac901a4c95189e8a" exitCode=0 Sep 30 19:35:59 crc kubenswrapper[4747]: I0930 19:35:59.365313 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" event={"ID":"9b7473f3-c7c2-418f-93d9-51b3a98ee670","Type":"ContainerDied","Data":"f75466424bb669dacb2f42162fb4d81b22981e0734682deaac901a4c95189e8a"} Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.471040 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.515626 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-bjxcr"] Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.525964 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-bjxcr"] Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.591469 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b7473f3-c7c2-418f-93d9-51b3a98ee670-host\") pod \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\" (UID: \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\") " Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.591584 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9v7h\" (UniqueName: \"kubernetes.io/projected/9b7473f3-c7c2-418f-93d9-51b3a98ee670-kube-api-access-r9v7h\") pod \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\" (UID: \"9b7473f3-c7c2-418f-93d9-51b3a98ee670\") " Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.591660 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b7473f3-c7c2-418f-93d9-51b3a98ee670-host" (OuterVolumeSpecName: "host") pod "9b7473f3-c7c2-418f-93d9-51b3a98ee670" (UID: "9b7473f3-c7c2-418f-93d9-51b3a98ee670"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.592241 4747 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b7473f3-c7c2-418f-93d9-51b3a98ee670-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.601478 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7473f3-c7c2-418f-93d9-51b3a98ee670-kube-api-access-r9v7h" (OuterVolumeSpecName: "kube-api-access-r9v7h") pod "9b7473f3-c7c2-418f-93d9-51b3a98ee670" (UID: "9b7473f3-c7c2-418f-93d9-51b3a98ee670"). InnerVolumeSpecName "kube-api-access-r9v7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:00 crc kubenswrapper[4747]: I0930 19:36:00.692878 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9v7h\" (UniqueName: \"kubernetes.io/projected/9b7473f3-c7c2-418f-93d9-51b3a98ee670-kube-api-access-r9v7h\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.111084 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7473f3-c7c2-418f-93d9-51b3a98ee670" path="/var/lib/kubelet/pods/9b7473f3-c7c2-418f-93d9-51b3a98ee670/volumes" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.386333 4747 scope.go:117] "RemoveContainer" containerID="f75466424bb669dacb2f42162fb4d81b22981e0734682deaac901a4c95189e8a" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.386395 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-bjxcr" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.720349 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-98lf4"] Sep 30 19:36:01 crc kubenswrapper[4747]: E0930 19:36:01.721190 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7473f3-c7c2-418f-93d9-51b3a98ee670" containerName="container-00" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.721206 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7473f3-c7c2-418f-93d9-51b3a98ee670" containerName="container-00" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.721389 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7473f3-c7c2-418f-93d9-51b3a98ee670" containerName="container-00" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.722020 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.915260 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn8b7\" (UniqueName: \"kubernetes.io/projected/a7127d38-940e-4e7e-a7e3-47d18cf467a7-kube-api-access-mn8b7\") pod \"crc-debug-98lf4\" (UID: \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\") " pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:01 crc kubenswrapper[4747]: I0930 19:36:01.915745 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7127d38-940e-4e7e-a7e3-47d18cf467a7-host\") pod \"crc-debug-98lf4\" (UID: \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\") " pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:02 crc kubenswrapper[4747]: I0930 19:36:02.017709 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a7127d38-940e-4e7e-a7e3-47d18cf467a7-host\") pod \"crc-debug-98lf4\" (UID: \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\") " pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:02 crc kubenswrapper[4747]: I0930 19:36:02.017865 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7127d38-940e-4e7e-a7e3-47d18cf467a7-host\") pod \"crc-debug-98lf4\" (UID: \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\") " pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:02 crc kubenswrapper[4747]: I0930 19:36:02.018579 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn8b7\" (UniqueName: \"kubernetes.io/projected/a7127d38-940e-4e7e-a7e3-47d18cf467a7-kube-api-access-mn8b7\") pod \"crc-debug-98lf4\" (UID: \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\") " pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:02 crc kubenswrapper[4747]: I0930 19:36:02.053970 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn8b7\" (UniqueName: \"kubernetes.io/projected/a7127d38-940e-4e7e-a7e3-47d18cf467a7-kube-api-access-mn8b7\") pod \"crc-debug-98lf4\" (UID: \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\") " pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:02 crc kubenswrapper[4747]: I0930 19:36:02.346364 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:03 crc kubenswrapper[4747]: I0930 19:36:03.419086 4747 generic.go:334] "Generic (PLEG): container finished" podID="a7127d38-940e-4e7e-a7e3-47d18cf467a7" containerID="8e6f128669ad957d2bf3aa4f11e7dcf61285efef1b83ce23cc0842e437eb7f93" exitCode=0 Sep 30 19:36:03 crc kubenswrapper[4747]: I0930 19:36:03.419221 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/crc-debug-98lf4" event={"ID":"a7127d38-940e-4e7e-a7e3-47d18cf467a7","Type":"ContainerDied","Data":"8e6f128669ad957d2bf3aa4f11e7dcf61285efef1b83ce23cc0842e437eb7f93"} Sep 30 19:36:03 crc kubenswrapper[4747]: I0930 19:36:03.419744 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/crc-debug-98lf4" event={"ID":"a7127d38-940e-4e7e-a7e3-47d18cf467a7","Type":"ContainerStarted","Data":"adfec9d8e2b3bb4c26003d78cd2a2719c21eb421882cb0d93b774e5dc1204186"} Sep 30 19:36:04 crc kubenswrapper[4747]: I0930 19:36:04.498832 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:04 crc kubenswrapper[4747]: I0930 19:36:04.565745 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn8b7\" (UniqueName: \"kubernetes.io/projected/a7127d38-940e-4e7e-a7e3-47d18cf467a7-kube-api-access-mn8b7\") pod \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\" (UID: \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\") " Sep 30 19:36:04 crc kubenswrapper[4747]: I0930 19:36:04.566157 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7127d38-940e-4e7e-a7e3-47d18cf467a7-host\") pod \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\" (UID: \"a7127d38-940e-4e7e-a7e3-47d18cf467a7\") " Sep 30 19:36:04 crc kubenswrapper[4747]: I0930 19:36:04.566295 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7127d38-940e-4e7e-a7e3-47d18cf467a7-host" (OuterVolumeSpecName: "host") pod "a7127d38-940e-4e7e-a7e3-47d18cf467a7" (UID: "a7127d38-940e-4e7e-a7e3-47d18cf467a7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:36:04 crc kubenswrapper[4747]: I0930 19:36:04.566549 4747 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7127d38-940e-4e7e-a7e3-47d18cf467a7-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:04 crc kubenswrapper[4747]: I0930 19:36:04.582184 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7127d38-940e-4e7e-a7e3-47d18cf467a7-kube-api-access-mn8b7" (OuterVolumeSpecName: "kube-api-access-mn8b7") pod "a7127d38-940e-4e7e-a7e3-47d18cf467a7" (UID: "a7127d38-940e-4e7e-a7e3-47d18cf467a7"). InnerVolumeSpecName "kube-api-access-mn8b7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:04 crc kubenswrapper[4747]: I0930 19:36:04.667644 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn8b7\" (UniqueName: \"kubernetes.io/projected/a7127d38-940e-4e7e-a7e3-47d18cf467a7-kube-api-access-mn8b7\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:05 crc kubenswrapper[4747]: I0930 19:36:05.435446 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/crc-debug-98lf4" event={"ID":"a7127d38-940e-4e7e-a7e3-47d18cf467a7","Type":"ContainerDied","Data":"adfec9d8e2b3bb4c26003d78cd2a2719c21eb421882cb0d93b774e5dc1204186"} Sep 30 19:36:05 crc kubenswrapper[4747]: I0930 19:36:05.435485 4747 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adfec9d8e2b3bb4c26003d78cd2a2719c21eb421882cb0d93b774e5dc1204186" Sep 30 19:36:05 crc kubenswrapper[4747]: I0930 19:36:05.435535 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-98lf4" Sep 30 19:36:08 crc kubenswrapper[4747]: I0930 19:36:08.323122 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-98lf4"] Sep 30 19:36:08 crc kubenswrapper[4747]: I0930 19:36:08.335083 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-98lf4"] Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.107056 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7127d38-940e-4e7e-a7e3-47d18cf467a7" path="/var/lib/kubelet/pods/a7127d38-940e-4e7e-a7e3-47d18cf467a7/volumes" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.533788 4747 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-54zs4"] Sep 30 19:36:09 crc kubenswrapper[4747]: E0930 19:36:09.534685 4747 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7127d38-940e-4e7e-a7e3-47d18cf467a7" 
containerName="container-00" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.534706 4747 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7127d38-940e-4e7e-a7e3-47d18cf467a7" containerName="container-00" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.535040 4747 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7127d38-940e-4e7e-a7e3-47d18cf467a7" containerName="container-00" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.536061 4747 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.643018 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl2t7\" (UniqueName: \"kubernetes.io/projected/2891011f-c523-4b1c-94cd-f704562bcf97-kube-api-access-zl2t7\") pod \"crc-debug-54zs4\" (UID: \"2891011f-c523-4b1c-94cd-f704562bcf97\") " pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.643431 4747 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2891011f-c523-4b1c-94cd-f704562bcf97-host\") pod \"crc-debug-54zs4\" (UID: \"2891011f-c523-4b1c-94cd-f704562bcf97\") " pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.745849 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl2t7\" (UniqueName: \"kubernetes.io/projected/2891011f-c523-4b1c-94cd-f704562bcf97-kube-api-access-zl2t7\") pod \"crc-debug-54zs4\" (UID: \"2891011f-c523-4b1c-94cd-f704562bcf97\") " pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.746042 4747 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2891011f-c523-4b1c-94cd-f704562bcf97-host\") pod \"crc-debug-54zs4\" (UID: \"2891011f-c523-4b1c-94cd-f704562bcf97\") " pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.746277 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2891011f-c523-4b1c-94cd-f704562bcf97-host\") pod \"crc-debug-54zs4\" (UID: \"2891011f-c523-4b1c-94cd-f704562bcf97\") " pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.779773 4747 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl2t7\" (UniqueName: \"kubernetes.io/projected/2891011f-c523-4b1c-94cd-f704562bcf97-kube-api-access-zl2t7\") pod \"crc-debug-54zs4\" (UID: \"2891011f-c523-4b1c-94cd-f704562bcf97\") " pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:09 crc kubenswrapper[4747]: I0930 19:36:09.863776 4747 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:10 crc kubenswrapper[4747]: I0930 19:36:10.482215 4747 generic.go:334] "Generic (PLEG): container finished" podID="2891011f-c523-4b1c-94cd-f704562bcf97" containerID="bbe15e572b8819319cf4497388f9dd0e41d92e1422e2f63f11aa052b17ffbdfa" exitCode=0 Sep 30 19:36:10 crc kubenswrapper[4747]: I0930 19:36:10.482289 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/crc-debug-54zs4" event={"ID":"2891011f-c523-4b1c-94cd-f704562bcf97","Type":"ContainerDied","Data":"bbe15e572b8819319cf4497388f9dd0e41d92e1422e2f63f11aa052b17ffbdfa"} Sep 30 19:36:10 crc kubenswrapper[4747]: I0930 19:36:10.482779 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/crc-debug-54zs4" event={"ID":"2891011f-c523-4b1c-94cd-f704562bcf97","Type":"ContainerStarted","Data":"104d9bb2cf1d670bd50f2684a319dd678f99e24630e11c5b16e8ee2473ca380f"} Sep 30 19:36:10 crc kubenswrapper[4747]: I0930 19:36:10.547355 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-54zs4"] Sep 30 19:36:10 crc kubenswrapper[4747]: I0930 19:36:10.558957 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kgq9j/crc-debug-54zs4"] Sep 30 19:36:11 crc kubenswrapper[4747]: I0930 19:36:11.617651 4747 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:11 crc kubenswrapper[4747]: I0930 19:36:11.786492 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl2t7\" (UniqueName: \"kubernetes.io/projected/2891011f-c523-4b1c-94cd-f704562bcf97-kube-api-access-zl2t7\") pod \"2891011f-c523-4b1c-94cd-f704562bcf97\" (UID: \"2891011f-c523-4b1c-94cd-f704562bcf97\") " Sep 30 19:36:11 crc kubenswrapper[4747]: I0930 19:36:11.786522 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2891011f-c523-4b1c-94cd-f704562bcf97-host\") pod \"2891011f-c523-4b1c-94cd-f704562bcf97\" (UID: \"2891011f-c523-4b1c-94cd-f704562bcf97\") " Sep 30 19:36:11 crc kubenswrapper[4747]: I0930 19:36:11.786958 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2891011f-c523-4b1c-94cd-f704562bcf97-host" (OuterVolumeSpecName: "host") pod "2891011f-c523-4b1c-94cd-f704562bcf97" (UID: "2891011f-c523-4b1c-94cd-f704562bcf97"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:36:11 crc kubenswrapper[4747]: I0930 19:36:11.792318 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2891011f-c523-4b1c-94cd-f704562bcf97-kube-api-access-zl2t7" (OuterVolumeSpecName: "kube-api-access-zl2t7") pod "2891011f-c523-4b1c-94cd-f704562bcf97" (UID: "2891011f-c523-4b1c-94cd-f704562bcf97"). InnerVolumeSpecName "kube-api-access-zl2t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:11 crc kubenswrapper[4747]: I0930 19:36:11.888640 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl2t7\" (UniqueName: \"kubernetes.io/projected/2891011f-c523-4b1c-94cd-f704562bcf97-kube-api-access-zl2t7\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:11 crc kubenswrapper[4747]: I0930 19:36:11.888667 4747 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2891011f-c523-4b1c-94cd-f704562bcf97-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.097253 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd_81844e1b-ba6d-4561-8997-4b784d92c5da/util/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.229546 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd_81844e1b-ba6d-4561-8997-4b784d92c5da/util/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.254629 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd_81844e1b-ba6d-4561-8997-4b784d92c5da/pull/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.262648 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd_81844e1b-ba6d-4561-8997-4b784d92c5da/pull/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.440846 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd_81844e1b-ba6d-4561-8997-4b784d92c5da/util/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.441092 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd_81844e1b-ba6d-4561-8997-4b784d92c5da/pull/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.482640 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0ekrfsd_81844e1b-ba6d-4561-8997-4b784d92c5da/extract/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.500690 4747 scope.go:117] "RemoveContainer" containerID="bbe15e572b8819319cf4497388f9dd0e41d92e1422e2f63f11aa052b17ffbdfa" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.500746 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kgq9j/crc-debug-54zs4" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.652888 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lrcwn_b99836bf-ec49-4751-a4ab-0656ad583daf/kube-rbac-proxy/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.678602 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-g6kv2_0e3fd103-b88a-4289-a0f3-959c5d1de5d3/kube-rbac-proxy/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.683871 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-g6kv2_0e3fd103-b88a-4289-a0f3-959c5d1de5d3/manager/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.870625 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lrcwn_b99836bf-ec49-4751-a4ab-0656ad583daf/manager/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.880438 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-4x7bb_9b526a9f-9938-4391-903c-6c5a861256d3/kube-rbac-proxy/0.log" Sep 30 19:36:12 crc kubenswrapper[4747]: I0930 19:36:12.929202 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-4x7bb_9b526a9f-9938-4391-903c-6c5a861256d3/manager/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.096539 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2891011f-c523-4b1c-94cd-f704562bcf97" path="/var/lib/kubelet/pods/2891011f-c523-4b1c-94cd-f704562bcf97/volumes" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.120868 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-pzbbf_dc5a3b8a-b0b5-40d5-a3b8-806105406c70/kube-rbac-proxy/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.144238 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-pzbbf_dc5a3b8a-b0b5-40d5-a3b8-806105406c70/manager/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.203678 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-62mgn_8d0a9431-6171-433c-a100-8d36b93e3422/kube-rbac-proxy/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.307054 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-62mgn_8d0a9431-6171-433c-a100-8d36b93e3422/manager/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.317628 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-x5qln_ad81d435-bedc-463a-a8c6-346ab3b5ee5a/kube-rbac-proxy/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.371625 
4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-x5qln_ad81d435-bedc-463a-a8c6-346ab3b5ee5a/manager/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.480551 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-sfgn2_eeb2c9bd-6467-4105-a4b9-800c69b815d0/kube-rbac-proxy/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.604953 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-sfgn2_eeb2c9bd-6467-4105-a4b9-800c69b815d0/manager/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.650899 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-np9sf_48f86fea-23fe-488b-8429-7e97c676675b/kube-rbac-proxy/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.679501 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-np9sf_48f86fea-23fe-488b-8429-7e97c676675b/manager/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.784530 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-9j6gj_f925f84a-6e3d-45be-82cf-320f7654adb0/kube-rbac-proxy/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.872859 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-9j6gj_f925f84a-6e3d-45be-82cf-320f7654adb0/manager/0.log" Sep 30 19:36:13 crc kubenswrapper[4747]: I0930 19:36:13.952295 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-t4kgw_da47eadd-a352-46c7-84aa-fbc25bfac106/kube-rbac-proxy/0.log" Sep 30 19:36:13 crc 
kubenswrapper[4747]: I0930 19:36:13.954445 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-t4kgw_da47eadd-a352-46c7-84aa-fbc25bfac106/manager/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.087534 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:36:14 crc kubenswrapper[4747]: E0930 19:36:14.088101 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.093850 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-vkxhv_4159cab8-0891-49ef-9293-425697a48dcd/kube-rbac-proxy/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.141320 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-vkxhv_4159cab8-0891-49ef-9293-425697a48dcd/manager/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.237655 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-xczb4_2ac6e9ef-b559-4a14-861a-84c9a2308e06/kube-rbac-proxy/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.301468 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-xczb4_2ac6e9ef-b559-4a14-861a-84c9a2308e06/manager/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.369204 4747 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-lxkkz_f508c857-c282-4b4e-9d93-e7358f3aad9d/kube-rbac-proxy/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.495647 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-lxkkz_f508c857-c282-4b4e-9d93-e7358f3aad9d/manager/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.535335 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-jq9pq_f9931c40-80f2-4467-a708-0b2fa69e05e0/manager/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.581618 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-jq9pq_f9931c40-80f2-4467-a708-0b2fa69e05e0/kube-rbac-proxy/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.684610 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq_72272c30-dd3c-4779-bea8-e65f9e5ecd06/kube-rbac-proxy/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.696117 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55f4778d9fcvfvq_72272c30-dd3c-4779-bea8-e65f9e5ecd06/manager/0.log" Sep 30 19:36:14 crc kubenswrapper[4747]: I0930 19:36:14.877468 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5468b64689-vzw94_eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0/kube-rbac-proxy/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.028701 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-d8fdfd448-nhb7f_8ae757f1-995e-4b91-ac4f-20962f2a7072/kube-rbac-proxy/0.log" Sep 
30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.130146 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-drgjp_ea91f08a-965d-42d3-bad0-62e39f0c442f/registry-server/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.203371 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-d8fdfd448-nhb7f_8ae757f1-995e-4b91-ac4f-20962f2a7072/operator/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.302973 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-xnttg_b521c8df-d702-482e-b6cc-d04577dc0936/kube-rbac-proxy/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.391060 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-xnttg_b521c8df-d702-482e-b6cc-d04577dc0936/manager/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.489524 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-jm22c_d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a/manager/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.494461 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-jm22c_d4ea0624-40a2-40c1-b8b5-1b3d64b1e52a/kube-rbac-proxy/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.510215 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5468b64689-vzw94_eb4f1d1d-6e6b-4bed-9a4e-d8630cb432e0/manager/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.646375 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-f528p_6a934ea5-7f7f-4274-aac9-135961df32a6/operator/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.655989 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-mqc9x_ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92/kube-rbac-proxy/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.690555 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-mqc9x_ab2b2a41-52ce-4bdd-93c0-d03dc2b98f92/manager/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.815025 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7bdb6cfb74-8jxt9_9942a15a-9f22-465e-8c11-ce97e741f65f/manager/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.824037 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7bdb6cfb74-8jxt9_9942a15a-9f22-465e-8c11-ce97e741f65f/kube-rbac-proxy/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.886896 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-pcbjx_f53a4663-01e9-42d3-8dc5-f708734dfe6c/kube-rbac-proxy/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.935432 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-pcbjx_f53a4663-01e9-42d3-8dc5-f708734dfe6c/manager/0.log" Sep 30 19:36:15 crc kubenswrapper[4747]: I0930 19:36:15.995724 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-2rskh_a60ae9a4-d39a-466a-9b2a-6122270c4614/kube-rbac-proxy/0.log" Sep 30 19:36:16 crc kubenswrapper[4747]: I0930 19:36:16.034388 4747 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-2rskh_a60ae9a4-d39a-466a-9b2a-6122270c4614/manager/0.log" Sep 30 19:36:29 crc kubenswrapper[4747]: I0930 19:36:29.087871 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:36:29 crc kubenswrapper[4747]: E0930 19:36:29.088971 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:36:32 crc kubenswrapper[4747]: I0930 19:36:32.671140 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-sq5x4_d526e7bd-199d-4d9f-8826-6eee8fc0fa8d/control-plane-machine-set-operator/0.log" Sep 30 19:36:32 crc kubenswrapper[4747]: I0930 19:36:32.871333 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7n9gf_708b7382-ffc3-42e3-ac45-e1776b18473e/kube-rbac-proxy/0.log" Sep 30 19:36:32 crc kubenswrapper[4747]: I0930 19:36:32.906401 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7n9gf_708b7382-ffc3-42e3-ac45-e1776b18473e/machine-api-operator/0.log" Sep 30 19:36:44 crc kubenswrapper[4747]: I0930 19:36:44.087784 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:36:44 crc kubenswrapper[4747]: E0930 19:36:44.088954 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:36:46 crc kubenswrapper[4747]: I0930 19:36:46.125299 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-q2nml_c8b2ec10-81fd-46e8-b73c-b8141264574c/cert-manager-controller/0.log" Sep 30 19:36:46 crc kubenswrapper[4747]: I0930 19:36:46.339766 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-4slkd_7469fdd1-8820-4af1-87ee-d4bd00a8f211/cert-manager-cainjector/0.log" Sep 30 19:36:46 crc kubenswrapper[4747]: I0930 19:36:46.426635 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-2qsp9_809dacb9-cd8e-4950-8092-bc3c647db303/cert-manager-webhook/0.log" Sep 30 19:36:58 crc kubenswrapper[4747]: I0930 19:36:58.087846 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:36:58 crc kubenswrapper[4747]: E0930 19:36:58.089065 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:36:59 crc kubenswrapper[4747]: I0930 19:36:59.302272 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-c6cft_e3ab74aa-6201-4f76-b93a-04339dca4de7/nmstate-console-plugin/0.log" Sep 30 19:36:59 crc 
kubenswrapper[4747]: I0930 19:36:59.458908 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-bh6p4_b5822edc-1696-439a-b703-5dbc2720e0aa/kube-rbac-proxy/0.log" Sep 30 19:36:59 crc kubenswrapper[4747]: I0930 19:36:59.460124 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-r5l5k_11d1c978-5e5f-4957-9963-2194e68f6cd7/nmstate-handler/0.log" Sep 30 19:36:59 crc kubenswrapper[4747]: I0930 19:36:59.562609 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-bh6p4_b5822edc-1696-439a-b703-5dbc2720e0aa/nmstate-metrics/0.log" Sep 30 19:36:59 crc kubenswrapper[4747]: I0930 19:36:59.637684 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-m7n96_c6a5113e-62d6-4f8e-9ea4-7c0f9502ae07/nmstate-operator/0.log" Sep 30 19:36:59 crc kubenswrapper[4747]: I0930 19:36:59.742370 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-hr477_5997966c-5225-4adf-a02c-7ed6788335c2/nmstate-webhook/0.log" Sep 30 19:37:10 crc kubenswrapper[4747]: I0930 19:37:10.087103 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:37:10 crc kubenswrapper[4747]: E0930 19:37:10.088153 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.265772 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5d688f5ffc-kxnbc_78df1507-7948-4a08-a2be-1b1a60cbb9ff/kube-rbac-proxy/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.365341 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-kxnbc_78df1507-7948-4a08-a2be-1b1a60cbb9ff/controller/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.426756 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-frr-files/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.613317 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-reloader/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.642629 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-frr-files/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.647040 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-metrics/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.654813 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-reloader/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.811385 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-reloader/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.853060 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-metrics/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.854966 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-metrics/0.log" Sep 30 19:37:14 crc kubenswrapper[4747]: I0930 19:37:14.861984 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-frr-files/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.026686 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-frr-files/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.048116 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-reloader/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.051522 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/controller/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.052404 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/cp-metrics/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.228836 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/frr-metrics/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.264101 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/kube-rbac-proxy/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.285645 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/kube-rbac-proxy-frr/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.418404 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/reloader/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.505843 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-swnb8_6af1f61d-d5a1-4a3a-94ef-08d731e3b1aa/frr-k8s-webhook-server/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.676014 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-d99c7c5b9-tzfsr_cb80bc02-e6c8-453b-b63b-5a95783c2520/manager/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.875372 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5d95fdcc77-tsv28_e14405c7-e1c8-4713-b14b-58926d71206b/webhook-server/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.968096 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lk65v_f190785a-8ea7-4e42-b803-192a77b4c874/frr/0.log" Sep 30 19:37:15 crc kubenswrapper[4747]: I0930 19:37:15.969288 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hgt8c_f5763869-994d-4a75-a8a9-00bee3414aad/kube-rbac-proxy/0.log" Sep 30 19:37:16 crc kubenswrapper[4747]: I0930 19:37:16.319301 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hgt8c_f5763869-994d-4a75-a8a9-00bee3414aad/speaker/0.log" Sep 30 19:37:21 crc kubenswrapper[4747]: I0930 19:37:21.091440 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:37:21 crc kubenswrapper[4747]: E0930 19:37:21.093500 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:37:29 crc kubenswrapper[4747]: I0930 19:37:29.253685 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk_238ba769-d3c7-4101-931c-f752b3343092/util/0.log" Sep 30 19:37:29 crc kubenswrapper[4747]: I0930 19:37:29.402280 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk_238ba769-d3c7-4101-931c-f752b3343092/util/0.log" Sep 30 19:37:29 crc kubenswrapper[4747]: I0930 19:37:29.454735 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk_238ba769-d3c7-4101-931c-f752b3343092/pull/0.log" Sep 30 19:37:29 crc kubenswrapper[4747]: I0930 19:37:29.502143 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk_238ba769-d3c7-4101-931c-f752b3343092/pull/0.log" Sep 30 19:37:29 crc kubenswrapper[4747]: I0930 19:37:29.631441 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk_238ba769-d3c7-4101-931c-f752b3343092/extract/0.log" Sep 30 19:37:29 crc kubenswrapper[4747]: I0930 19:37:29.655767 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk_238ba769-d3c7-4101-931c-f752b3343092/util/0.log" Sep 30 19:37:29 crc kubenswrapper[4747]: I0930 19:37:29.679531 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l9lrk_238ba769-d3c7-4101-931c-f752b3343092/pull/0.log" Sep 30 19:37:29 crc kubenswrapper[4747]: I0930 19:37:29.831598 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6_923fff81-6af4-4c7f-b77a-2d7f0d4a557c/util/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.022953 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6_923fff81-6af4-4c7f-b77a-2d7f0d4a557c/util/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.024170 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6_923fff81-6af4-4c7f-b77a-2d7f0d4a557c/pull/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.040016 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6_923fff81-6af4-4c7f-b77a-2d7f0d4a557c/pull/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.150502 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6_923fff81-6af4-4c7f-b77a-2d7f0d4a557c/util/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.211841 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6_923fff81-6af4-4c7f-b77a-2d7f0d4a557c/pull/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.216135 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcd7gs6_923fff81-6af4-4c7f-b77a-2d7f0d4a557c/extract/0.log" Sep 30 
19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.363348 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w49q7_e7c5dfa5-42d2-4344-bfb3-bd0781f392c5/extract-utilities/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.479249 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w49q7_e7c5dfa5-42d2-4344-bfb3-bd0781f392c5/extract-utilities/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.521072 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w49q7_e7c5dfa5-42d2-4344-bfb3-bd0781f392c5/extract-content/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.562594 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w49q7_e7c5dfa5-42d2-4344-bfb3-bd0781f392c5/extract-content/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.692078 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w49q7_e7c5dfa5-42d2-4344-bfb3-bd0781f392c5/extract-utilities/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.782960 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w49q7_e7c5dfa5-42d2-4344-bfb3-bd0781f392c5/extract-content/0.log" Sep 30 19:37:30 crc kubenswrapper[4747]: I0930 19:37:30.905780 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9zdv_d4395a0b-1f58-4e40-a352-db7ad2b5e688/extract-utilities/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.137333 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9zdv_d4395a0b-1f58-4e40-a352-db7ad2b5e688/extract-content/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.186271 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-s9zdv_d4395a0b-1f58-4e40-a352-db7ad2b5e688/extract-content/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.216307 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9zdv_d4395a0b-1f58-4e40-a352-db7ad2b5e688/extract-utilities/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.300035 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w49q7_e7c5dfa5-42d2-4344-bfb3-bd0781f392c5/registry-server/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.375920 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9zdv_d4395a0b-1f58-4e40-a352-db7ad2b5e688/extract-content/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.397615 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9zdv_d4395a0b-1f58-4e40-a352-db7ad2b5e688/extract-utilities/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.604650 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs_5a83c062-1bc6-4c6a-83cf-40064d73606f/util/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.791041 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9zdv_d4395a0b-1f58-4e40-a352-db7ad2b5e688/registry-server/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.798584 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs_5a83c062-1bc6-4c6a-83cf-40064d73606f/pull/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.838883 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs_5a83c062-1bc6-4c6a-83cf-40064d73606f/pull/0.log" Sep 30 19:37:31 crc kubenswrapper[4747]: I0930 19:37:31.842869 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs_5a83c062-1bc6-4c6a-83cf-40064d73606f/util/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.007643 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs_5a83c062-1bc6-4c6a-83cf-40064d73606f/pull/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.009461 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs_5a83c062-1bc6-4c6a-83cf-40064d73606f/util/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.024748 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96xmnhs_5a83c062-1bc6-4c6a-83cf-40064d73606f/extract/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.154985 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wl22v_461defee-db4a-4cf5-bb7b-bffeb4bdf244/marketplace-operator/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.211787 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qj7z8_f33b4813-db02-4f07-9662-58046af264f2/extract-utilities/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.342182 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qj7z8_f33b4813-db02-4f07-9662-58046af264f2/extract-content/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.364051 4747 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qj7z8_f33b4813-db02-4f07-9662-58046af264f2/extract-utilities/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.374873 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qj7z8_f33b4813-db02-4f07-9662-58046af264f2/extract-content/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.508857 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qj7z8_f33b4813-db02-4f07-9662-58046af264f2/extract-content/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.509079 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qj7z8_f33b4813-db02-4f07-9662-58046af264f2/extract-utilities/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.635440 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qj7z8_f33b4813-db02-4f07-9662-58046af264f2/registry-server/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.680137 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bc55_9b4f7959-bc18-4ea1-b257-4a1a1aa17395/extract-utilities/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.823466 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bc55_9b4f7959-bc18-4ea1-b257-4a1a1aa17395/extract-utilities/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.823469 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bc55_9b4f7959-bc18-4ea1-b257-4a1a1aa17395/extract-content/0.log" Sep 30 19:37:32 crc kubenswrapper[4747]: I0930 19:37:32.859033 4747 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-9bc55_9b4f7959-bc18-4ea1-b257-4a1a1aa17395/extract-content/0.log" Sep 30 19:37:33 crc kubenswrapper[4747]: I0930 19:37:33.011294 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bc55_9b4f7959-bc18-4ea1-b257-4a1a1aa17395/extract-content/0.log" Sep 30 19:37:33 crc kubenswrapper[4747]: I0930 19:37:33.012272 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bc55_9b4f7959-bc18-4ea1-b257-4a1a1aa17395/extract-utilities/0.log" Sep 30 19:37:33 crc kubenswrapper[4747]: I0930 19:37:33.086973 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:37:33 crc kubenswrapper[4747]: E0930 19:37:33.087422 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:37:33 crc kubenswrapper[4747]: I0930 19:37:33.483496 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9bc55_9b4f7959-bc18-4ea1-b257-4a1a1aa17395/registry-server/0.log" Sep 30 19:37:44 crc kubenswrapper[4747]: I0930 19:37:44.087082 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:37:44 crc kubenswrapper[4747]: E0930 19:37:44.087836 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:37:57 crc kubenswrapper[4747]: I0930 19:37:57.087217 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:37:57 crc kubenswrapper[4747]: E0930 19:37:57.087946 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:38:12 crc kubenswrapper[4747]: I0930 19:38:12.088334 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:38:12 crc kubenswrapper[4747]: E0930 19:38:12.089853 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:38:26 crc kubenswrapper[4747]: I0930 19:38:26.088123 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:38:26 crc kubenswrapper[4747]: E0930 19:38:26.089206 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:38:39 crc kubenswrapper[4747]: I0930 19:38:39.088076 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:38:39 crc kubenswrapper[4747]: E0930 19:38:39.089584 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:38:53 crc kubenswrapper[4747]: I0930 19:38:53.087863 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:38:53 crc kubenswrapper[4747]: E0930 19:38:53.088969 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:39:08 crc kubenswrapper[4747]: I0930 19:39:08.087285 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc" Sep 30 19:39:08 crc kubenswrapper[4747]: E0930 19:39:08.088075 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7" Sep 30 19:39:09 crc kubenswrapper[4747]: I0930 19:39:09.232623 4747 generic.go:334] "Generic (PLEG): container finished" podID="33174d61-4dee-47d5-b4b9-bf03d6e2deeb" containerID="030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad" exitCode=0 Sep 30 19:39:09 crc kubenswrapper[4747]: I0930 19:39:09.232738 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kgq9j/must-gather-89th2" event={"ID":"33174d61-4dee-47d5-b4b9-bf03d6e2deeb","Type":"ContainerDied","Data":"030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad"} Sep 30 19:39:09 crc kubenswrapper[4747]: I0930 19:39:09.235434 4747 scope.go:117] "RemoveContainer" containerID="030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad" Sep 30 19:39:09 crc kubenswrapper[4747]: I0930 19:39:09.610070 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kgq9j_must-gather-89th2_33174d61-4dee-47d5-b4b9-bf03d6e2deeb/gather/0.log" Sep 30 19:39:17 crc kubenswrapper[4747]: I0930 19:39:17.453769 4747 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kgq9j/must-gather-89th2"] Sep 30 19:39:17 crc kubenswrapper[4747]: I0930 19:39:17.454919 4747 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kgq9j/must-gather-89th2" podUID="33174d61-4dee-47d5-b4b9-bf03d6e2deeb" containerName="copy" containerID="cri-o://65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36" gracePeriod=2 Sep 30 19:39:17 crc kubenswrapper[4747]: I0930 19:39:17.463023 4747 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kgq9j/must-gather-89th2"] Sep 30 19:39:17 crc kubenswrapper[4747]: I0930 
19:39:17.876601 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kgq9j_must-gather-89th2_33174d61-4dee-47d5-b4b9-bf03d6e2deeb/copy/0.log"
Sep 30 19:39:17 crc kubenswrapper[4747]: I0930 19:39:17.877532 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kgq9j/must-gather-89th2"
Sep 30 19:39:17 crc kubenswrapper[4747]: I0930 19:39:17.977520 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-must-gather-output\") pod \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\" (UID: \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\") "
Sep 30 19:39:17 crc kubenswrapper[4747]: I0930 19:39:17.977580 4747 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgmfh\" (UniqueName: \"kubernetes.io/projected/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-kube-api-access-fgmfh\") pod \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\" (UID: \"33174d61-4dee-47d5-b4b9-bf03d6e2deeb\") "
Sep 30 19:39:17 crc kubenswrapper[4747]: I0930 19:39:17.986004 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-kube-api-access-fgmfh" (OuterVolumeSpecName: "kube-api-access-fgmfh") pod "33174d61-4dee-47d5-b4b9-bf03d6e2deeb" (UID: "33174d61-4dee-47d5-b4b9-bf03d6e2deeb"). InnerVolumeSpecName "kube-api-access-fgmfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.080457 4747 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgmfh\" (UniqueName: \"kubernetes.io/projected/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-kube-api-access-fgmfh\") on node \"crc\" DevicePath \"\""
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.087051 4747 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "33174d61-4dee-47d5-b4b9-bf03d6e2deeb" (UID: "33174d61-4dee-47d5-b4b9-bf03d6e2deeb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.182086 4747 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33174d61-4dee-47d5-b4b9-bf03d6e2deeb-must-gather-output\") on node \"crc\" DevicePath \"\""
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.337707 4747 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kgq9j_must-gather-89th2_33174d61-4dee-47d5-b4b9-bf03d6e2deeb/copy/0.log"
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.338162 4747 generic.go:334] "Generic (PLEG): container finished" podID="33174d61-4dee-47d5-b4b9-bf03d6e2deeb" containerID="65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36" exitCode=143
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.338240 4747 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kgq9j/must-gather-89th2"
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.338284 4747 scope.go:117] "RemoveContainer" containerID="65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36"
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.373540 4747 scope.go:117] "RemoveContainer" containerID="030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad"
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.454328 4747 scope.go:117] "RemoveContainer" containerID="65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36"
Sep 30 19:39:18 crc kubenswrapper[4747]: E0930 19:39:18.454674 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36\": container with ID starting with 65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36 not found: ID does not exist" containerID="65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36"
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.454705 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36"} err="failed to get container status \"65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36\": rpc error: code = NotFound desc = could not find container \"65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36\": container with ID starting with 65c61ed004cec0f6becc83ea39630971381afd4e411a209fcc35cea9ef524a36 not found: ID does not exist"
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.454725 4747 scope.go:117] "RemoveContainer" containerID="030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad"
Sep 30 19:39:18 crc kubenswrapper[4747]: E0930 19:39:18.455096 4747 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad\": container with ID starting with 030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad not found: ID does not exist" containerID="030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad"
Sep 30 19:39:18 crc kubenswrapper[4747]: I0930 19:39:18.455116 4747 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad"} err="failed to get container status \"030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad\": rpc error: code = NotFound desc = could not find container \"030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad\": container with ID starting with 030a81944d9a5443da20d278aeade3e9cd9fa066821539d3b4787bd9704749ad not found: ID does not exist"
Sep 30 19:39:19 crc kubenswrapper[4747]: I0930 19:39:19.105701 4747 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33174d61-4dee-47d5-b4b9-bf03d6e2deeb" path="/var/lib/kubelet/pods/33174d61-4dee-47d5-b4b9-bf03d6e2deeb/volumes"
Sep 30 19:39:23 crc kubenswrapper[4747]: I0930 19:39:23.086751 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc"
Sep 30 19:39:23 crc kubenswrapper[4747]: E0930 19:39:23.087707 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7"
Sep 30 19:39:34 crc kubenswrapper[4747]: I0930 19:39:34.087461 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc"
Sep 30 19:39:34 crc kubenswrapper[4747]: E0930 19:39:34.096567 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7"
Sep 30 19:39:49 crc kubenswrapper[4747]: I0930 19:39:49.087711 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc"
Sep 30 19:39:49 crc kubenswrapper[4747]: E0930 19:39:49.088754 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7"
Sep 30 19:40:00 crc kubenswrapper[4747]: I0930 19:40:00.087857 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc"
Sep 30 19:40:00 crc kubenswrapper[4747]: E0930 19:40:00.088967 4747 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pkmxs_openshift-machine-config-operator(a3fce119-955f-405b-bfb3-96aa4b34aef7)\"" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" podUID="a3fce119-955f-405b-bfb3-96aa4b34aef7"
Sep 30 19:40:14 crc kubenswrapper[4747]: I0930 19:40:14.086781 4747 scope.go:117] "RemoveContainer" containerID="285a26b93ccd5e947ca0f06d9d030c8c693ed45f2bbb64b45c4773e9469c9bfc"
Sep 30 19:40:14 crc kubenswrapper[4747]: I0930 19:40:14.945476 4747 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pkmxs" event={"ID":"a3fce119-955f-405b-bfb3-96aa4b34aef7","Type":"ContainerStarted","Data":"5036bc1b43bae76a725686e076c6ed15b7677366d5fa9f68e62cc75ef516d0da"}